ARToolKit | Mailing List Archive |
From: | Mark Billinghurst <grof@h ..................> | Received: | Jan 11, 2002 |
To: | artoolkit@h .................. |
Subject: | Happy New Year ! |
Happy New Year to all the ARToolKit users out there.

The big news for me is that I have finally defended my PhD thesis, so I can now give the ARToolKit website and documentation the major overhaul they need - look for some changes soon!

I have uploaded ARToolKit version 2.52 for Windows to the website (http://www.hitl.washington.edu/artoolkit/). There are three big differences between this and the last version of ARToolKit (2.431):

* Support for multiple marker tracking - you can now track a set of markers on the same sheet of paper, which helps with some of the occlusion problem. See multi.exe in the bin directory - but print out patterns/multiPatt.pdf first.

* A new, faster camera calibration technique - try camera_calib2.exe - but print out patterns/calib_dist.pdf first.

* A DirectShow version of ARToolKit - this has been a very long time coming, but thanks to the expert programming of Brian Cross it's here. Download ARToolKitDirectShow2.52.zip to give it a try. You'll need the DirectX 8 runtime libraries installed to run the demos and the DirectShow SDK to compile the code. The DirectShow version seems to run at least twice as fast as the older Vision SDK code.

Please give these a try and let me know if you have any questions or problems. There is a README.txt file with each 2.52 download, but no other documentation yet. I'm working on that.

Thanks to Hirokazu Kato for all his excellent work on the new version of ARToolKit.

In the next week I will put up the following code:
* ARToolKit 2.52 for Linux
* A FireWire version of ARToolKit for Linux
* A VRML parser/renderer for Linux
* A cool application from Pablo Gussmann and his team at Siemens that lets you use .avi files for video input rather than a live video stream.

Another item of note is that Dieter Schmalstieg and I will be presenting a half-day tutorial on Augmented Reality at the VR 2002 conference in Orlando at the end of March. I'll be showing some ARToolKit demos as well as distributing CDs with the software. Tell all your friends to come - see www.vr2002.org for more information.

Anyway - I hope you're all having a great start to the year, and please let us know what improvements you'd like to see in ARToolKit.

Cheers,
Mark

PS See http://www.hitl.washington.edu/artoolkit/ for subscribing/unsubscribing information.

+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Mark Billinghurst        | Human Interface Technology Laboratory
grof@h ................. | University of Washington, Box 352-142
fax: +1-206-543-5380     | Seattle, WA 98195
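For readers who want to try the new multiple-marker tracking from their own code rather than through multi.exe, the sketch below shows the rough per-frame flow using ARToolKit's arMulti functions. The config file path, threshold value, and surrounding video capture are assumptions, not part of the announcement - check the multi example source in the 2.52 distribution for the authoritative version.

/* Rough sketch of using the new multi-marker tracking. Assumptions:
 * config file path, threshold, and that 'image' comes from your video source. */
#include <AR/ar.h>
#include <AR/arMulti.h>

static ARMultiMarkerInfoT *mmConfig = NULL;

int setupMultiMarker(void)
{
    /* describes the set of markers printed on the one sheet (multiPatt.pdf);
       the path below is a guess - use whatever the 2.52 sample data provides */
    mmConfig = arMultiReadConfigFile("Data/multi/marker.dat");
    return (mmConfig != NULL);
}

int trackMultiMarker(ARUint8 *image)       /* one captured video frame */
{
    ARMarkerInfo *marker_info;
    int           marker_num;
    int           thresh = 100;            /* binarisation threshold, tune for your camera */

    if (arDetectMarker(image, thresh, &marker_info, &marker_num) < 0)
        return -1;

    /* combine every visible marker into one board pose; this is what lets the
       tracking survive partial occlusion of individual markers */
    if (arMultiGetTransMat(marker_info, marker_num, mmConfig) < 0)
        return -1;

    /* mmConfig->trans now holds the 3x4 camera-to-board transformation */
    return 0;
}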
From: | Gerhard Reitmayr <reitmayr@i ...............> | Received: | Jan 12, 2002 |
To: | Mark Billinghurst <grof@h ..................> |
Subject: | Re: Happy New Year ! |
Happy New Year to everyone!

Thanks Mark, this is great work. I love the DirectShow version - finally no more tweaks needed to use our firewire cameras :). I added a little extension to the ARFrameGrabber class to do vertical or horizontal image flips in addition to the currently available rotation. I needed that because my images were flipped vertically, not rotated. I have attached the modified files - please use them as you like.

bye, Gerhard

Mark Billinghurst wrote:
> Happy New Year to all the ARToolKit users out there..
> [quoted announcement clipped - see the original message above]

--
Gerhard Reitmayr
mailto:reitmayr@i ...............
tel: ++43 1 58801 18856
Interactive Media Systems Group
Vienna University of Technology

Attachment: ARFrameG.h

// ARFrameGrabber.h: interface for the ARFrameGrabber class.
//
//////////////////////////////////////////////////////////////////////

#if !defined(AFX_ARFRAMEGRABBER_H__C5553937_4BAB_4FEF_B4A6_1693AB0C99E3__INCLUDED_)
#define AFX_ARFRAMEGRABBER_H__C5553937_4BAB_4FEF_B4A6_1693AB0C99E3__INCLUDED_

#if _MSC_VER > 1000
#pragma once
#endif // _MSC_VER > 1000

#define WIN32_LEAN_AND_MEAN
#include <windows.h>
#include <dshow.h>
//#include <streams.h>
#include <atlbase.h>
#include <qedit.h>
#include <string.h>     // strlen / strcpy used by SmartString

class SmartString
{
public:
    SmartString() : str(NULL) {}

    SmartString(char* pStr) : str(NULL)
    {
        if (pStr) {
            int size = strlen(pStr);
            str = new char[size+1];
            strcpy(str, pStr);
        }
    }

    // initialise str before SetString tries to delete the old buffer
    SmartString(SmartString& sStr) : str(NULL)
    {
        SetString(sStr.GetBuffer());
    }

    ~SmartString()
    {
        if (str) delete[] str;
    }

    SmartString& operator =(char* pStr)
    {
        SetString(pStr);
        return *this;
    }

    SmartString& operator =(SmartString& sStr)
    {
        SetString(sStr.GetBuffer());
        return *this;
    }

    char* GetBuffer() { return str; }

protected:
    void SetString(char *pStr)
    {
        if (str) delete[] str;
        if (!pStr) {
            str = NULL;
        }
        else {
            int size = strlen(pStr);
            str = new char[size + 1];
            strcpy(str, pStr);
        }
    }

    char* str;
};

struct DeviceInfo
{
    DeviceInfo() : next(NULL), deviceId(-1)
    {
    }

    ~DeviceInfo()
    {
        if (next) delete next;
    }

    SmartString friendlyName;
    int         deviceId;
    DeviceInfo* next;
};

class ARFrameGrabber
{
public:
    ARFrameGrabber();
    virtual ~ARFrameGrabber();

    void Init(int deviceId);
    void BindFilter(int deviceId, IBaseFilter **pFilter);
    void GrabFrame(long* size, long** pBuffer);
    void GrabFrame();
    void Grab32BitFrame();

    long  GetBufferSize() { return bufferSize; }
    long* GetBuffer()     { return pBuffer; }

    void SetFlippedImageHorizontal(bool flag) { flipImageH = flag; }
    void SetFlippedImageVertical(bool flag)   { flipImageV = flag; }
    void SetFlippedImage(bool flag)           { flipImageV = flag; flipImageH = flag; }

    void DisplayProperties();
    void EnumDevices(DeviceInfo *head);

protected:
    CComPtr<IGraphBuilder>  pGraph;
    CComPtr<IBaseFilter>    pDeviceFilter;
    CComPtr<IMediaControl>  pMediaControl;
    CComPtr<IBaseFilter>    pSampleGrabberFilter;
    CComPtr<ISampleGrabber> pSampleGrabber;
    CComPtr<IPin>           pGrabberInput;
    CComPtr<IPin>           pGrabberOutput;
    CComPtr<IPin>           pCameraOutput;
    CComPtr<IMediaEvent>    pMediaEvent;
    CComPtr<IBaseFilter>    pNullFilter;
    CComPtr<IPin>           pNullInputPin;

    void FlipImage(long* pBuf);

private:
    void ReportError(char *msg);

    bool flipImageH;
    bool flipImageV;
    long bufferSize;
    long *pBuffer;
};

#endif // !defined(AFX_ARFRAMEGRABBER_H__C5553937_4BAB_4FEF_B4A6_1693AB0C99E3__INCLUDED_)

Attachment: ARFrameG.cpp

// ARFrameGrabber.cpp: implementation of the ARFrameGrabber class.
//
//////////////////////////////////////////////////////////////////////

#include "stdafx.h"
#include "ARFrameGrabber.h"

//////////////////////////////////////////////////////////////////////
// Construction/Destruction
//////////////////////////////////////////////////////////////////////

ARFrameGrabber::ARFrameGrabber()
    : pBuffer(NULL), bufferSize(0), flipImageV(false), flipImageH(false)
{
}

ARFrameGrabber::~ARFrameGrabber()
{
    if (pMediaControl) pMediaControl->Stop();
}

void ARFrameGrabber::Init(int deviceId)
{
    HRESULT hr = S_OK;

    // Create the Filter Graph Manager.
    hr = CoCreateInstance(CLSID_FilterGraph, NULL, CLSCTX_INPROC,
                          IID_IGraphBuilder, (void **)&pGraph);
    hr = CoCreateInstance(CLSID_SampleGrabber, NULL, CLSCTX_INPROC_SERVER,
                          IID_IBaseFilter, (LPVOID *)&pSampleGrabberFilter);
    hr = pGraph->QueryInterface(IID_IMediaControl, (void **)&pMediaControl);
    hr = pGraph->QueryInterface(IID_IMediaEvent, (void **)&pMediaEvent);
    hr = CoCreateInstance(CLSID_NullRenderer, NULL, CLSCTX_INPROC_SERVER,
                          IID_IBaseFilter, (LPVOID *)&pNullFilter);
    hr = pGraph->AddFilter(pNullFilter, L"NullRenderer");

    hr = pSampleGrabberFilter->QueryInterface(IID_ISampleGrabber, (void**)&pSampleGrabber);

    AM_MEDIA_TYPE mt;
    ZeroMemory(&mt, sizeof(AM_MEDIA_TYPE));
    mt.majortype  = MEDIATYPE_Video;
    mt.subtype    = MEDIASUBTYPE_RGB32;
    mt.formattype = FORMAT_VideoInfo;
    hr = pSampleGrabber->SetMediaType(&mt);
    pGraph->AddFilter(pSampleGrabberFilter, L"Grabber");

    // Bind Device Filter. We know the device because the id was passed in.
    BindFilter(deviceId, &pDeviceFilter);
    pGraph->AddFilter(pDeviceFilter, NULL);

    CComPtr<IEnumPins> pEnum;
    pDeviceFilter->EnumPins(&pEnum);
    hr = pEnum->Reset();
    hr = pEnum->Next(1, &pCameraOutput, NULL);

    pEnum = NULL;
    pSampleGrabberFilter->EnumPins(&pEnum);
    pEnum->Reset();
    hr = pEnum->Next(1, &pGrabberInput, NULL);

    pEnum = NULL;
    pSampleGrabberFilter->EnumPins(&pEnum);
    pEnum->Reset();
    pEnum->Skip(1);
    hr = pEnum->Next(1, &pGrabberOutput, NULL);

    pEnum = NULL;
    pNullFilter->EnumPins(&pEnum);
    pEnum->Reset();
    hr = pEnum->Next(1, &pNullInputPin, NULL);

    hr = pGraph->Connect(pCameraOutput, pGrabberInput);
    hr = pGraph->Connect(pGrabberOutput, pNullInputPin);
    // hr = pGraph->Render(pGrabberOutput);

    if (FAILED(hr)) {
        switch (hr) {
            case VFW_S_NOPREVIEWPIN :
                break;
            case E_FAIL :
                break;
            case E_INVALIDARG :
                break;
            case E_POINTER :
                break;
        }
    }

    pSampleGrabber->SetBufferSamples(TRUE);
    pSampleGrabber->SetOneShot(TRUE);
}

void ARFrameGrabber::GrabFrame(long* size, long** pBuffer)
{
    if (!size) return;
    // don't want to leak mem, pBuffer must be NULL
    if (!pBuffer || *pBuffer) return;

    long evCode;
    pMediaControl->Run();
    pMediaEvent->WaitForCompletion(INFINITE, &evCode);
    pSampleGrabber->GetCurrentBuffer(size, NULL);
    if (*size) {
        *pBuffer = new long[*size];
    }
    pSampleGrabber->GetCurrentBuffer(size, *pBuffer);
}

void ARFrameGrabber::GrabFrame()
{
    long evCode;
    long size = 0;

    pMediaControl->Run();
    pMediaEvent->WaitForCompletion(INFINITE, &evCode);
    pSampleGrabber->GetCurrentBuffer(&size, NULL);

    // if buffer is not the same size as before, create a new one
    if (size != bufferSize) {
        if (pBuffer) delete[] pBuffer;
        bufferSize = size;
        pBuffer = new long[bufferSize];
    }
    pSampleGrabber->GetCurrentBuffer(&size, pBuffer);

    if (flipImageV || flipImageH) FlipImage(pBuffer);
}

void ARFrameGrabber::FlipImage(long* pBuf)
{
    DWORD *ptr = (DWORD*)pBuf;
    int pixelCount = bufferSize/4;

    if (!pBuf) return;

    // Added code for more image manipulations
    // NOTE : hardcoded image size of 320 x 240 !
    // I guess these should be members of ARFrameGrabber and
    // set during initialization
    // Gerhard Reitmayr <reitmayr@i ...............
    int sizeX = 320;
    int sizeY = 240;

    if (flipImageV) {
        if (flipImageH) {
            // both flips set -> rotation about 180 degree
            for (int index = 0; index < pixelCount/2; index++) {
                ptr[index] = ptr[index] ^ ptr[pixelCount - index - 1];
                ptr[pixelCount - index - 1] = ptr[index] ^ ptr[pixelCount - index - 1];
                ptr[index] = ptr[index] ^ ptr[pixelCount - index - 1];
            }
        }
        else {
            // only vertical flip
            for (int line = 0; line < sizeY/2; line++)
                for (int pixel = 0; pixel < sizeX; pixel++) {
                    ptr[line*sizeX + pixel] = ptr[line*sizeX + pixel] ^ ptr[pixelCount - line*sizeX - (sizeX - pixel)];
                    ptr[pixelCount - line*sizeX - (sizeX - pixel)] = ptr[line*sizeX + pixel] ^ ptr[pixelCount - line*sizeX - (sizeX - pixel)];
                    ptr[line*sizeX + pixel] = ptr[line*sizeX + pixel] ^ ptr[pixelCount - line*sizeX - (sizeX - pixel)];
                }
        }
    }
    else {
        if (flipImageH) {
            // only horizontal flip; the mirror pixel is (sizeX - 1 - pixel) on the same line
            for (int line = 0; line < sizeY; line++)
                for (int pixel = 0; pixel < sizeX/2; pixel++) {
                    ptr[line*sizeX + pixel] = ptr[line*sizeX + pixel] ^ ptr[line*sizeX + (sizeX - 1 - pixel)];
                    ptr[line*sizeX + (sizeX - 1 - pixel)] = ptr[line*sizeX + pixel] ^ ptr[line*sizeX + (sizeX - 1 - pixel)];
                    ptr[line*sizeX + pixel] = ptr[line*sizeX + pixel] ^ ptr[line*sizeX + (sizeX - 1 - pixel)];
                }
        }
    }
}

void ARFrameGrabber::Grab32BitFrame()
{
    long evCode;
    long size = 0;
    long* pData;
    unsigned char* pTemp;
    unsigned char* ptr;

    pMediaControl->Run();
    pMediaEvent->WaitForCompletion(INFINITE, &evCode);
    pSampleGrabber->GetCurrentBuffer(&size, NULL);

    if (size != bufferSize) {
        if (pBuffer) delete[] pBuffer;
        bufferSize = size/3*4;    // add space for padding
        pBuffer = new long[bufferSize];
    }

    pData = (long*) new unsigned char[size];
    pSampleGrabber->GetCurrentBuffer(&size, pData);

    ptr = ((unsigned char*)pBuffer) + bufferSize - 1;
    pTemp = (unsigned char*) pData;

    // do the padding
    for (int index = 0; index < size/3; index++) {
        unsigned char r = *(pTemp++);
        unsigned char g = *(pTemp++);
        unsigned char b = *(pTemp++);
        *(ptr--) = 0;
        *(ptr--) = b;
        *(ptr--) = g;
        *(ptr--) = r;
    }
    /*
    for (int index = 0; index < size; index++) {
        *ptr = ((unsigned char *)pTemp)[index];
        ptr--;
        if (index % 3 == 2) {
            *ptr = 0;
            ptr--;
        }
    }
    */
    delete[] pData;
}

void ARFrameGrabber::BindFilter(int deviceId, IBaseFilter **pFilter)
{
    if (deviceId < 0) return;

    // enumerate all video capture devices
    CComPtr<ICreateDevEnum> pCreateDevEnum;
    HRESULT hr = CoCreateInstance(CLSID_SystemDeviceEnum, NULL, CLSCTX_INPROC_SERVER,
                                  IID_ICreateDevEnum, (void**)&pCreateDevEnum);
    if (hr != NOERROR) {
        // ErrMsg("Error Creating Device Enumerator");
        return;
    }

    CComPtr<IEnumMoniker> pEm;
    hr = pCreateDevEnum->CreateClassEnumerator(CLSID_VideoInputDeviceCategory, &pEm, 0);
    if (hr != NOERROR) {
        // ErrMsg("Sorry, you have no video capture hardware");
        return;
    }
    pEm->Reset();
    ULONG cFetched;
    IMoniker *pM;
    int index = 0;
    // stop as soon as the enumerator runs out or the requested device has been passed
    while ((hr = pEm->Next(1, &pM, &cFetched)) == S_OK && index <= deviceId) {
        IPropertyBag *pBag;
        hr = pM->BindToStorage(0, 0, IID_IPropertyBag, (void **)&pBag);
        if (SUCCEEDED(hr)) {
            VARIANT var;
            var.vt = VT_BSTR;
            hr = pBag->Read(L"FriendlyName", &var, NULL);
            if (hr == NOERROR) {
                if (index == deviceId) {
                    pM->BindToObject(0, 0, IID_IBaseFilter, (void**)pFilter);
                }
                SysFreeString(var.bstrVal);
            }
            pBag->Release();
        }
        pM->Release();
        index++;
    }
}

void ARFrameGrabber::DisplayProperties()
{
    CComPtr<ISpecifyPropertyPages> pPages;

    HRESULT hr = pCameraOutput->QueryInterface(IID_ISpecifyPropertyPages, (void**)&pPages);
    if (SUCCEEDED(hr)) {
        PIN_INFO PinInfo;
        pCameraOutput->QueryPinInfo(&PinInfo);

        CAUUID caGUID;
        pPages->GetPages(&caGUID);

        OleCreatePropertyFrame(
            NULL,
            0,
            0,
            L"Property Sheet",
            1,
            (IUnknown **)&(pCameraOutput.p),
            caGUID.cElems,
            caGUID.pElems,
            0,
            0,
            NULL);
        CoTaskMemFree(caGUID.pElems);
        PinInfo.pFilter->Release();
    }
}

void ARFrameGrabber::EnumDevices(DeviceInfo *head)
{
    if (!head) return;

    DeviceInfo *ptr = head;
    int id = 0;

    // enumerate all video capture devices
    CComPtr<ICreateDevEnum> pCreateDevEnum;
    // ICreateDevEnum *pCreateDevEnum;
    HRESULT hr = CoCreateInstance(CLSID_SystemDeviceEnum, NULL, CLSCTX_INPROC_SERVER,
                                  IID_ICreateDevEnum, (void**)&pCreateDevEnum);
    if (hr != NOERROR) {
        // ErrMsg("Error Creating Device Enumerator");
        return;
    }

    CComPtr<IEnumMoniker> pEm;
    hr = pCreateDevEnum->CreateClassEnumerator(CLSID_VideoInputDeviceCategory, &pEm, 0);
    if (hr != NOERROR) {
        // ErrMsg("Sorry, you have no video capture hardware");
        return;
    }
    pEm->Reset();
    ULONG cFetched;
    IMoniker *pM;
    while ((hr = pEm->Next(1, &pM, &cFetched)) == S_OK) {
        IPropertyBag *pBag;
        hr = pM->BindToStorage(0, 0, IID_IPropertyBag, (void **)&pBag);
        if (SUCCEEDED(hr)) {
            VARIANT var;
            var.vt = VT_BSTR;
            hr = pBag->Read(L"FriendlyName", &var, NULL);
            if (hr == NOERROR) {
                char str[2048];
                if (ptr->deviceId != -1) {
                    ptr->next = new DeviceInfo();
                    ptr = ptr->next;
                }
                ptr->deviceId = id++;
                WideCharToMultiByte(CP_ACP, 0, var.bstrVal, -1, str, 2048, NULL, NULL);
                ptr->friendlyName = str;
                SysFreeString(var.bstrVal);
            }
            pBag->Release();
        }
        pM->Release();
    }
}

void ARFrameGrabber::ReportError(char *msg)
{
    MessageBox(NULL, msg, "ARFrameGrabber Error", MB_ICONSTOP);
}
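For completeness, here is a minimal usage sketch of the grabber with the new flip switch, based only on the interface in ARFrameG.h above; the device id, the COM initialisation calls, and main() are assumptions, not part of the attached code.

// Minimal usage sketch (assumptions: device 0 is the camera, the caller owns COM init).
#include <windows.h>
#include "ARFrameG.h"

int main()
{
    CoInitialize(NULL);                          // DirectShow is COM-based
    {
        ARFrameGrabber grabber;
        grabber.Init(0);                         // bind the first video capture device
        grabber.SetFlippedImageVertical(true);   // the new flip, instead of the 180 degree rotation

        grabber.GrabFrame();                     // run the graph and copy one RGB32 frame
        long *pixels = grabber.GetBuffer();      // 320x240 RGB32 (see the note in FlipImage)
        long  bytes  = grabber.GetBufferSize();  // buffer size in bytes, as reported by the sample grabber
        // ... hand 'pixels' to the ARToolKit tracker here ...
    }                                            // grabber and its DirectShow objects are released here
    CoUninitialize();
    return 0;
}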