ARToolKit | Mailing List Archive |
From: | "Khamene, Ali" <akhamene@s ..............> | Received: | Mar 9, 2001 |
To | "'ARforum@l ..........'" <ARforum@l ..........> | ||
Subject: | |||
FORUM ON AUGMENTED AND VIRTUAL REALITY

Call For Participation

IEEE & ACM International Symposium on Augmented Reality 2001 (ISAR'01)
October 29-30, 2001
Columbia University, New York, NY
http://www.cs.columbia.edu/graphics/isar2001

Paper submissions due by June 5, 2001
Demo submissions due by August 5, 2001

Objectives:
ISAR'01 will provide an opportunity for Augmented Reality (AR) researchers from academia and industry to meet in an informal atmosphere to exchange ideas, concepts, and research results. ISAR is meant to trigger discussions among participants and to provide an intensive exchange between the academic and industrial branches, as well as between researchers working in the different AR research areas.

History:
ISAR'01 is the fourth in a series of successful events sponsored by IEEE and ACM in cooperation with Eurographics:
ISAR'00: International Symposium on Augmented Reality, October 2000, Munich, Germany
IWAR'99: International Workshop on Augmented Reality, October 1999, San Francisco, USA
IWAR'98: October 1998, San Francisco, USA

Topics:
* AR applications
* personal AR information systems
* industrial AR applications
* medical AR applications
* AR for Architecture
* requirements for usable AR
* system architecture (software and hardware design)
* wearable computing
* performance issues (approaches for achieving real-time AR)
* distributed AR
* information presentation
* display hardware
* real-time rendering
* photorealistic rendering (e.g. reflection analysis)
* object overlay
* aural augmentation
* Diminished Reality
* Mixed Reality
* sensors for position and orientation tracking
* calibration methods
* tracker registration methods
* tracking during user motion
* tracking of changes in the real world (moving objects)
* computer vision methods for registration
* acquisition of 3D scene descriptions
* consideration of human factors
* user interaction for AR (e.g. multi-modal input and output)
* user acceptance of AR technology

Invited Speakers:
David Hawkes, Guy's, King's & St Thomas' School of Medicine, UK
Ulrich Neumann, University of Southern California, USA
Jun Rekimoto, Sony Computer Science Laboratory, Japan

General Chairs:
Nassir Navab*, Siemens Corporate Research, USA
Steve Feiner*, Columbia University, USA

Program Chairs:
Ron Azuma*, HRL Laboratories, USA
Gudrun Klinker*, Technische Universität München, Germany
Hideyuki Tamura*, Mixed Reality Laboratories, Japan

Demo Chair:
Mihran Tuceryan, Indiana University - Purdue University, USA

Program Committee:
Yuichiro Akatsuka, Olympus Optical Company, Japan
Reinhold Behringer*, Rockwell Science Center, USA
Mark Billinghurst, University of Washington, USA
Wolfgang Birkfellner, University of Vienna, Austria
Kostas Daniilidis, University of Pennsylvania, USA
Paul Debevec, University of Southern California, USA
Anthony DiGioia, Carnegie Mellon University, USA
Stephen Ellis, NASA Ames Research Center, USA
Andrew Fitzgibbon, University of Oxford, UK
Eric Foxlin, Intersense Inc., USA
Pascal Fua, Swiss Federal Institute of Tech., Switzerland
David Hawkes, Guy's, King's & St Thomas' School of Medicine, UK
Michitaka Hirose, University of Tokyo, Japan
William Hoff, Colorado School of Mines, USA
Tobias Höllerer, Columbia University, New York, USA
Simon Julier, Naval Research Laboratory, USA
Anthony Majoros, The Boeing Company, USA
Blair MacIntyre, Georgia Institute of Technology, USA
David Mizell*, Desana Systems, California, USA
Paul Milgram, University of Toronto, Canada
Stefan Müller*, Fraunhofer IGD, Germany
Ulrich Neumann*, University of Southern California, USA
Dirk Reiners, Fraunhofer IGD/ZGDV, Germany
Jun Rekimoto*, Sony Computer Science Laboratory, Japan
Albert Rizzo, University of Southern California, USA
Dieter Schmalstieg, Vienna University of Technology, Austria
Gilles Simon, LORIA-INRIA Lorraine, France
Andrei State, University of North Carolina Chapel Hill, USA
Didier Stricker*, Fraunhofer IGD, Germany
V Sundareswaran, Rockwell Science Center, USA
Haruo Takemura*, Nara Institute of Science & Technology, Japan
Mihran Tuceryan, Indiana University - Purdue University, USA
Jim Vallino, Rochester Institute of Technology, USA
Hiroyuki Yamamoto, Mixed Reality Systems Laboratory, Japan
Andrew Zisserman, University of Oxford, UK

* Member of ISAR steering committee

______________________________
SIEMENS
Ali Khamene, Ph.D.            Tel: (609) 734-6553
Imaging and Visualization     Fax: (609) 734-6565
755 College Road East         akhamene@s ..............
Princeton NJ 08540
______________________________ |
From: | Junaidi <junaidia@y ........> | Received: | Aug 14, 2001 |
To | Hirokazu Kato <kato@s ........................>, Jeremy Goslin <jeremy@j ...............>, Mark Billinghurst <grof@h ..................> | ||
Subject: | |||
Hi, I am a new ARToolKit user. From this website (http://www.hitl.washington.edu/people/grof/SharedSpace/Download/ARToolKitPC.htm) I downloaded the ARToolKitPCTest.zip file and uncompressed it. When I double-clicked the ExCamera.exe icon, a window appeared with live video in one corner, and the image was not upside down. But when I double-clicked the simple.exe icon, the image was upside down, and when I pointed my camera at the tracking patterns I couldn't see the OpenGL primitives as described. Was it because the image was upside down, or because of something else? Any suggestions? For your information, I am using Windows 2000, a Hauppauge WinTV-GO model 607, and a PULNIX TM-765 camera, which is black and white. Many thanks. Joe. |
From: | Eric Seibel <eseibel@h ..................> | Received: | Aug 27, 2001 |
To | arforum@t ......... | ||
Subject: | |||
*************************************************
Eric J. Seibel, PhD
Research Assistant Professor, Mechanical Engineering, and
Assistant Director for Technology Development at the Human Interface Technology Lab
215 Fluke Hall, University of Washington, Box 352142
Seattle, WA, USA 98195-2142
voice: (206) 616-1486   fax: (206) 543-5380
e-mail: eseibel@h .................. |
From: | "Arnaud Chanonier" <achanonier@h ..........> | Received: | Mar 28, 2002 |
To | artoolkit@h .................. | ||
Subject: | |||
Hi, I'm a French student and I'm new to this board. I'm doing an internship on augmented reality. Right now I have a working prototype of see-through AR using WorldToolKit and a Flock of Birds for tracking. I now want to implement a prototype with camera tracking instead of electromagnetic tracking. I have really no idea how to start; any help or advice would be appreciated. I don't have the camera yet, but it's going to be a Sony micro camera with a Matrox Meteor 2. Is there a special focal length required for the camera? Will a 6 mm lens be OK with ARToolKit calibration? Thanks in advance. Arnaud |
From: | "sune" <sunek@d ..........> | Received: | Jan 3, 2003 |
To | artoolkit@h .................. | ||
Subject: | |||
Hello. If I have two (or more) copies of the same marker and I would like to have a VRML animation on each of them, how can this be solved? I have been trying to modify the example program simpleVRML, but I have not succeeded yet. I know that the function arDetectMarker() returns information about the markers recognized by image processing; if several markers have the same pattern, this function returns information for all of them. So I should handle this in a way that makes it possible to show the same animation more than once when more than one copy of the marker is seen by the camera. But how? Have any of you tried this before? If so, a code sample would be really nice. Best regards, Sune Kristensen |
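A minimal sketch of what Sune describes, assuming an ARToolKit 2.x setup along the lines of the simpleVRML sample: patt_id, patt_width, patt_center, thresh and draw_object() are placeholder names introduced here (in simpleVRML the drawing call would be arVrmlDraw()), not anything defined in the post. Rather than keeping only the single best match, the loop draws the object once per detection whose id matches the pattern:

    #include <AR/ar.h>
    #include <AR/gsub.h>     /* argConvGlpara() and the OpenGL headers */

    /* assumed to be set up elsewhere, as in simpleVRML */
    extern int    patt_id;
    extern double patt_width;
    extern double patt_center[2];
    extern void   draw_object(void);       /* stand-in for arVrmlDraw() etc. */

    static void draw_all_copies(ARUint8 *dataPtr, int thresh)
    {
        ARMarkerInfo *marker_info;
        int           marker_num, j;
        double        patt_trans[3][4], gl_para[16];

        if (arDetectMarker(dataPtr, thresh, &marker_info, &marker_num) < 0) return;

        for (j = 0; j < marker_num; j++) {
            if (marker_info[j].id != patt_id) continue;   /* not our pattern      */
            if (marker_info[j].cf  < 0.5)     continue;   /* low-confidence match */

            /* a separate marker-to-camera transformation for this copy */
            arGetTransMat(&marker_info[j], patt_center, patt_width, patt_trans);

            argConvGlpara(patt_trans, gl_para);           /* 3x4 -> OpenGL 4x4    */
            glMatrixMode(GL_MODELVIEW);
            glLoadMatrixd(gl_para);
            draw_object();                                /* same animation, drawn once per copy */
        }
    }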
From: | Stuart Reeves <str00u@c ............> | Received: | Mar 5, 2003 |
To | ARToolkit List <artoolkit@h ..................> | ||
Subject: | |||
Hi,

I've tried to use two arDetectMarker functions using the same data pointer to the image, but I've found that subsequent arGetTransMat calls, when also using arMultiGetTransMat on the same image, don't work.

For instance, I have:

while () {
    ...
    // Grab a video frame
    if ((dataPtr = (ARUint8*)arVideoGetImage()) == NULL) {
        arUtilSleep(2);
        //return;
    }
    ...
    if (arDetectMarkerLite(dataPtr, THRESHOLD, &locationMarkers, &numLocationMarkers) < 0 &&
        arDetectMarker(dataPtr, THRESHOLD, &entityMarkers, &numEntityMarkers) < 0) {
        cleanup();
        break;
    }
    ...
}

I've found that when using the above code, only arMultiGetTransMat can actually detect markers. Using arGetTransMat in the same loop detects no markers. If I comment out the arDetectMarkerLite call, then the arGetTransMat function returns results and correctly detects markers. Does anyone know why this is?

Thanks.

later,
Stuart
---
email: stuart@t ............
my pointless, self-congratulatory vanity site is located here: http://www.nontrivial.uklinux.net [http://lns.sf.net for code] |
From: | "Wayne Piekarski" <wayne@c ..............> | Received: | Mar 6, 2003 |
To | "Stuart" <stuart@t ............>, "ARToolkit List" <artoolkit@h ..................> | ||
Subject: | Re: | ||
When you call arDetectMarker, you only pass in a pointer, which it then points at its internal array of markers. The problem is that this array is a global variable, so it gets rewritten on each call. Between the two calls you need to copy out all the data that you need before making the second call.

This same problem also prevents ARToolKit from being used in two separate threads simultaneously. It would be really cool if the structure were passed in as an argument (instead of being global); that way there would be no re-entrancy problem and it could be used in multiple instances easily.

regards,
Wayne
----------------------------------------------------------------------------
Wayne Piekarski - PhD Researcher / Lecturer    pho: +61-8-8302-3669   fax: +61-8-8302-3381
Tinmith Project - Wearable Computer Lab        mob: 0407-395-889
Advanced Computing Research Centre             ema: wayne@c ..............
University of South Australia                  www: http://www.tinmith.net |
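A minimal sketch of the copy-out Wayne describes, written to slot into Stuart's loop in place of the combined if; the locationCopy array and the MAX_LOCATION_MARKERS bound are assumptions of this sketch, not part of the ARToolKit API:

    /* somewhere before the loop */
    #define MAX_LOCATION_MARKERS 64                 /* arbitrary bound, pick your own */
    ARMarkerInfo locationCopy[MAX_LOCATION_MARKERS];
    int          numLocationCopy, i;

    /* inside the loop, replacing the combined if () */

    /* first call: locationMarkers points into ARToolKit's internal, global buffer */
    if (arDetectMarkerLite(dataPtr, THRESHOLD, &locationMarkers, &numLocationMarkers) < 0) {
        cleanup();
        break;
    }

    /* copy out everything we still need ... */
    numLocationCopy = (numLocationMarkers < MAX_LOCATION_MARKERS)
                    ? numLocationMarkers : MAX_LOCATION_MARKERS;
    for (i = 0; i < numLocationCopy; i++) locationCopy[i] = locationMarkers[i];

    /* ... because the second call rewrites that same buffer */
    if (arDetectMarker(dataPtr, THRESHOLD, &entityMarkers, &numEntityMarkers) < 0) {
        cleanup();
        break;
    }

    /* from here on, feed locationCopy/numLocationCopy (not locationMarkers)
       to arMultiGetTransMat and arGetTransMat */

Plain structure assignment is enough for the copy because ARMarkerInfo holds only values and fixed-size arrays, no pointers back into ARToolKit's buffers.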
From: | "Wayne Piekarski" <wayne@c ..............> | Received: | Mar 6, 2003 |
To | "Stuart" <stuart@t ............>, "ARToolkit List" <artoolkit@h ..................> | ||
Subject: | Re: | ||
When you call arDetectMarker you only pass in a pointer which it then points to its internal array of markers. The problem is that this is a global variable and so it gets rewritten each call. Between the two calls you need to copy out all the data that you need before making the second call. This same problem also prevents ARToolkit from being used within two separate threads simultaneously. It would be really cool if the structure was passed in as an argument (instead of global) that way there would be no re-entrancy problem and it could be used in multiple instances easily. regards, Wayne ---------------------------------------------------------------------------- Wayne Piekarski - PhD Researcher / Lecturer pho: +61-8-8302-3669 fax: +61-8-8302-3381 Tinmith Project - Wearable Computer Lab mob: 0407-395-889 Advanced Computing Research Centre ema: wayne@c .............. University of South Australia www: http://www.tinmith.net ----- Original Message ----- From: "Stuart Reeves" <str00u@c ............> To: "ARToolkit List" <artoolkit@h ..................> Sent: Thursday, March 06, 2003 3:17 AM > Hi, > > I've tried to use two arDetectMarker functions using the same data pointer > to the image, but i've found that subsequent arGetTransMat calls when > also using arMultiGetTransMat on the same image doesn't work. > > For instance, i have: > > while () { > > ... > > // Grab a video frame > if ((dataPtr = (ARUint8*)arVideoGetImage()) == NULL) { > arUtilSleep(2); > //return; > } > > ... > > if (arDetectMarkerLite(dataPtr, THRESHOLD, &locationMarkers, > &numLocationMarkers) < 0 && > arDetectMarker(dataPtr, THRESHOLD, &entityMarkers, > &numEntityMarkers) < 0 ) { > cleanup(); > break; > } > > ... > > } > > I've found that when using the above code, only arMultiGetTransMat can > actually detect markers. Using arGetTransMat in the same loop detects no > markers. > > If i comment out the arDetectMarkerLite call, then the arGetTransMat > function returns results and correctly detects markers. Does anyone know > why this is? > > Thanks. > > later, > Stuart > --- > email: stuart@t ............ > > my pointless, self-congratulatory vanity site is located here: > http://www.nontrivial.uklinux.net [http://lns.sf.net for code] > > > |
From: | "PATEL, NINA H. (JSC-SF) (NASA)" <nina.h.patel@n .......> | Received: | Mar 17, 2003 |
To | "'artoolkit@h ..................'" <artoolkit@h ..................> | ||
Subject: | |||
Nina H. Patel
Penn State University CO-OP
NASA Johnson Space Center, Space and Life Sciences, Human Factors
Bldg 15, Graphics Research Analysis Facility (GRAF Lab)
X-33667 (281/243-3667)   Cell: 267.250.3897   Apt: 713.432.7398
nina.h.patel@n .......
"To most people, the sky is the limit. To those who love flight, the sky is home." |
From: | "PATEL, NINA H. (JSC-SF) (NASA)" <nina.h.patel@n .......> | Received: | Mar 17, 2003 |
To | "'artoolkit@h ..................'" <artoolkit@h ..................> | ||
Subject: | |||
Nina H. Patel Penn State University CO-OP NASA Johnson Space Center Space and Life Sciences, Human Factors Bldg 15, Graphics Research Analysis Facility (GRAF Lab) X- 33667 (281/243-3667) Cell: 267.250.3897 Apt: 713.432.7398 nina.h.patel@n ....... "To most people, the sky is the limit.. To those who love flight, the sky is home". |
From: | lim kianhong <spartan_kings@y ........> | Received: | Jun 11, 2005 |
To | artoolkit@h .................. | ||
Subject: | |||
I'm a newbie with ARToolKit. There is something I wish to know about the toolkit. Correct me if I'm wrong: (1) first I need to calibrate the camera to get the perspective projection matrix; (2) then I need to do something to get the transformation matrix from the marker coordinate system to the camera coordinate system. I'm not sure how to get Tcm, so can anyone direct me to a link that explains the theory behind this? Thank you. |
From: | Ronald Sidharta <ronalds@g ........> | Received: | Jun 11, 2005 |
To | lim kianhong <spartan_kings@y ........>, artoolkit@h .................. | ||
Subject: | Re: | ||
Lim,

You are correct on both points.

For your first point, I think the default perspective projection matrix file that comes with ARToolKit is in general sufficient for many basic AR things you want to do. If you read the docs/mailing list, you'll find that the default camera parameter file is sufficient for putting virtual objects on top of the marker. However, if you need precise accuracy (such as for visual registration), I believe you need to do the camera calibration to get your actual camera's parameters.

For your second point, you are also correct. Using ARToolKit, you can get Tcm, which is the transformation matrix from marker to camera. I am expecting you already know how to get Tcm by coding, right? The theory behind Tcm is best explained in this paper:

Kato, H., Billinghurst, M., "Marker Tracking and HMD Calibration for a Video-Based Augmented Reality Conferencing System"

You can find that paper by asking Mr. Google.

Ronald |
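For the coding side (not covered in the paper), here is a rough sketch of pulling Tcm out of ARToolKit 2.x; patt_id, patt_width, patt_center and thresh are assumed to have been set up as in the simple example (arParamLoad, arLoadPatt, etc.), and get_Tcm is just a name chosen for this sketch:

    #include <AR/ar.h>

    /* assumed to be defined/loaded elsewhere, as in the simple example */
    extern int    patt_id;
    extern double patt_width;       /* marker width in mm  */
    extern double patt_center[2];   /* usually {0.0, 0.0}  */

    /* returns 0 and fills trans (= Tcm, marker -> camera) if the marker is seen */
    static int get_Tcm(ARUint8 *dataPtr, int thresh, double trans[3][4])
    {
        ARMarkerInfo *marker_info;
        int           marker_num, j, best = -1;

        if (arDetectMarker(dataPtr, thresh, &marker_info, &marker_num) < 0) return -1;

        /* keep the most confident detection of our pattern */
        for (j = 0; j < marker_num; j++) {
            if (marker_info[j].id != patt_id) continue;
            if (best < 0 || marker_info[j].cf > marker_info[best].cf) best = j;
        }
        if (best < 0) return -1;                  /* marker not visible */

        /* 3x4 transformation from marker coordinates to camera coordinates */
        arGetTransMat(&marker_info[best], patt_center, patt_width, trans);
        return 0;
    }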
From: | tenankesi <tenankesii@y ........> | Received: | Sep 25, 2005 |
To | artoolkit@h .................. | ||
Subject: | |||
please exclude me from the mailing list! thanks |
From: | "Mingwei Shen" <mingwei82@g ........> | Received: | Oct 26, 2006 |
To | artoolkit@h .................. | ||
Subject: | |||
Err.. forget my previous post. I found the reason... =P Mingwei |