Forum: Help (Thread #35402)

NyARToolkit for BeagleBoard-xm (2014-05-05 02:18 by johanna2 #72908)

Hello,

I'm working with the BeagleBoard-xM and I want to run NyARToolkit. I have already enabled the USB camera. Could you please send me the .apk version of NyARToolkit for the BeagleBoard, or tell me how to start making an app that uses a USB camera?

Thank you so much for your help.

Re: NyARToolkit for BeagleBoard-xm (2014-05-05 04:18 by noritsuna #72910)

Hi,

I have uploaded a UVC camera (USB camera) sample:
https://github.com/noritsuna/UVCCameraSampleForAndroid

The UVCCameraPreview#run method gets a Bitmap image from the UVC camera.
So if you merge that code with the CameraPreview class of NyARToolkit, you can use a USB camera on the BeagleBoard-xM.
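The shape of that merge can be sketched in plain Java. This is a minimal sketch with hypothetical names — UVCFrameSource stands in for the JNI side of UVCCameraPreview#run (processCamera + pixeltobmp), and FrameListener stands in for NyARToolkit's preview callback; on Android the frame would be a Bitmap rather than a byte[]:

```java
// Sketch of bridging a UVC capture loop to a NyARToolkit-style
// preview callback. All names here are hypothetical stand-ins.
class FrameBridge implements Runnable {
    /** Stands in for the JNI side (processCamera + pixeltobmp). */
    public interface UVCFrameSource {
        byte[] nextFrame(); // returns null when the camera is gone
    }
    /** Stands in for NyARToolkit's preview/CameraPreview callback. */
    public interface FrameListener {
        void onFrame(byte[] frame);
    }

    private final UVCFrameSource source;
    private final FrameListener listener;
    private volatile boolean shouldStop = false;

    FrameBridge(UVCFrameSource source, FrameListener listener) {
        this.source = source;
        this.listener = listener;
    }

    public void stop() { shouldStop = true; }

    @Override
    public void run() {
        // Same shape as UVCCameraPreview#run: pull a frame, hand it on.
        while (!shouldStop) {
            byte[] frame = source.nextFrame();
            if (frame == null) break;   // camera disappeared
            listener.onFrame(frame);    // NyARToolkit consumes the frame
        }
    }
}
```

The real loop additionally has to honor the surfaceDestroyed handshake (the shouldStop/sleep dance in the sample) before the native camera is released.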

Regards,
Noritsuna


> Hello,
>
> I'm working with the BeagleBoard-xM and I want to run NyARToolkit. I have already enabled the USB camera. Could you please send me the .apk version of NyARToolkit for the BeagleBoard, or tell me how to start making an app that uses a USB camera?
>
> Thank you so much for your help.
In reply to #72908

Re: NyARToolkit for BeagleBoard-xm (2014-05-13 23:27 by johanna2 #73002)

Hello,

Thank you so much for your help. I downloaded the UVC camera sample and I'm trying to run it in Android Studio, but I get the error "program sh not found in PATH". I also opened the NyARToolkit project, and I guess I have to create a class for the webcam in jp.androidgroup.nyartoolkit.hardware, but I'm not sure if I'm right.

Do I have to do anything else? Thanks — I don't have much experience creating Android apps.

Regards,

Johanna
In reply to #72910

Re: NyARToolkit for BeagleBoard-xm (2014-05-20 00:48 by johanna2 #73078)

Hello,

I'm still working on the BeagleBoard. I downloaded NyARToolkitAndroid-2.5.2. When I opened the project I found the UVCCamera.java class; I tried it, but the USB camera did not work, so I started modifying the class like this:

UVCCamera.java class

package jp.androidgroup.nyartoolkit.hardware;

import java.io.*;

import jp.androidgroup.nyartoolkit.NyARToolkitAndroidActivity;
import jp.androidgroup.nyartoolkit.model.VoicePlayer;
import jp.androidgroup.nyartoolkit.view.GLSurfaceView;
import android.annotation.SuppressLint;
import android.app.Activity;
import android.app.Dialog;
import android.app.ProgressDialog;
import android.content.Context;
import android.content.Intent;
import android.content.SharedPreferences;
import android.graphics.Bitmap;
import android.graphics.Canvas;
import android.graphics.Matrix;
import android.graphics.PixelFormat;
import android.graphics.Rect;
import android.hardware.Camera.Parameters;
import android.hardware.Camera.PictureCallback;
import android.hardware.Camera.PreviewCallback;
import android.location.Location;
import android.location.LocationManager;
import android.location.LocationProvider;
import android.media.MediaPlayer;
import android.os.Bundle;
import android.os.Handler;
import android.os.Message;
import android.preference.PreferenceManager;
import android.util.Config;
import android.util.Log;
import android.view.OrientationEventListener;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.view.View;
import android.view.ViewGroup;
import android.view.Window;
import android.view.WindowManager;
import android.widget.ImageView;
import android.widget.Toast;

public class UVCCamera extends Activity implements CameraIF, SurfaceHolder.Callback, Runnable{

public static final String TAG = "UVCCamera";

private PreviewCallback mJpegPreviewCallback = null;

private android.hardware.Camera.Parameters mParameters;

// The parameter strings to communicate with camera driver.
public static final String PARM_PREVIEW_SIZE = "preview-size";
public static final String PARM_PICTURE_SIZE = "picture-size";
public static final String PARM_JPEG_QUALITY = "jpeg-quality";
public static final String PARM_ROTATION = "rotation";
public static final String PARM_GPS_LATITUDE = "gps-latitude";
public static final String PARM_GPS_LONGITUDE = "gps-longitude";
public static final String PARM_GPS_ALTITUDE = "gps-altitude";
public static final String PARM_GPS_TIMESTAMP = "gps-timestamp";
public static final String SUPPORTED_ZOOM = "zoom-values";
public static final String SUPPORTED_PICTURE_SIZE = "picture-size-values";

private SharedPreferences mPreferences;

public static final int IDLE = 1;
public static final int SNAPSHOT_IN_PROGRESS = 2;
public static final int SNAPSHOT_COMPLETED = 3;

private int mStatus = IDLE;

private android.hardware.Camera mCameraDevice;

private NyARToolkitAndroidActivity mMainActivity;
private SurfaceView mSurfaceView;
private SurfaceHolder mSurfaceHolder = null;

private int mViewFinderWidth, mViewFinderHeight;

private ImageCapture mImageCapture = null;

private boolean mPreviewing;
private boolean mPausing;
private boolean mRecordLocation;

private static final int FOCUS_NOT_STARTED = 0;
private static final int FOCUSING = 1;
private static final int FOCUSING_SNAP_ON_FINISH = 2;
private static final int FOCUS_SUCCESS = 3;
private static final int FOCUS_FAIL = 4;
private int mFocusState = FOCUS_NOT_STARTED;

private LocationManager mLocationManager = null;

private final OneShotPreviewCallback mOneShotPreviewCallback = new OneShotPreviewCallback();
private final AutoFocusCallback mAutoFocusCallback = new AutoFocusCallback();

private String mFocusMode;

private Handler mHandler = null;

private MediaPlayer mVoiceSound = null;



//** Modified part of the class **//

private Bitmap bmp=null;

private static final boolean DEBUG = true;

// This definition also exists in ImageProc.h.
// Webcam must support the resolution 640x480 with YUYV format.
static final int IMG_WIDTH=640;
static final int IMG_HEIGHT=480;

// The following variables are used to draw camera images.
private int winWidth=0;
private int winHeight=0;
private Rect rect;
private int dw, dh;
private float rate;

// /dev/videox (x=cameraId+cameraBase) is used.
// In some omap devices, system uses /dev/video[0-3],
// so users must use /dev/video[4-].
// In such a case, try cameraId=0 and cameraBase=4

private int cameraId=0;
private int cameraBase=0;

// JNI functions
public native int prepareCamera(int videoid);
public native int prepareCameraWithBase(int videoid, int camerabase);
public native void processCamera();
public native void stopCamera();
public native void pixeltobmp(Bitmap bitmap);
static {
System.loadLibrary("ImageProc");
}

private boolean cameraExists=false;
private boolean shouldStop=false;
Thread mainLoop = null;



private LocationListener [] mLocationListeners = new LocationListener[] {
new LocationListener(LocationManager.GPS_PROVIDER),
new LocationListener(LocationManager.NETWORK_PROVIDER)
};


public UVCCamera(NyARToolkitAndroidActivity mMainActivity, SurfaceView mSurfaceView) {
Log.d(TAG, "instance");

this.mMainActivity = mMainActivity;
mPreferences = PreferenceManager.getDefaultSharedPreferences(mMainActivity);
mLocationManager = (LocationManager) mMainActivity.getSystemService(Context.LOCATION_SERVICE);

this.mSurfaceView = mSurfaceView;
SurfaceHolder holder = mSurfaceView.getHolder();
holder.addCallback(mMainActivity);
holder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);

mHandler = mMainActivity.getMessageHandler();
}

public void setPreviewCallback(PreviewCallback callback) {
mJpegPreviewCallback = callback;
}

public void setParameters(Parameters params) {
mCameraDevice.setParameters(params);
}

public Parameters getParameters() {
return mCameraDevice.getParameters();
}

public void resetPreviewSize(int width, int height) {
}

/**
* This Handler is used to post message back onto the main thread of the
* application
*/
public void handleMessage(Message msg) {
switch (msg.what) {
case NyARToolkitAndroidActivity.RESTART_PREVIEW: {
if (mStatus == SNAPSHOT_IN_PROGRESS) {
// We are still in the process of taking the picture; wait.
// This is strange. Why are we polling?
// TODO remove polling
Log.d(TAG, "sendEmptyMessageDelayed(RESTART_PREVIEW)");
mHandler.sendEmptyMessageDelayed(NyARToolkitAndroidActivity.RESTART_PREVIEW, 100);
}
else
restartPreview();
break;
}

case NyARToolkitAndroidActivity.SHOW_LOADING: {
stopPreview();
break;
}

case NyARToolkitAndroidActivity.HIDE_LOADING: {
startPreview();
break;
}
}
}

private class LocationListener
implements android.location.LocationListener {
Location mLastLocation;
boolean mValid = false;
String mProvider;

public LocationListener(String provider) {
mProvider = provider;
mLastLocation = new Location(mProvider);
}

public void onLocationChanged(Location newLocation) {
if (newLocation.getLatitude() == 0.0 && newLocation.getLongitude() == 0.0) {
// Hack to filter out 0.0,0.0 locations
return;
}
mLastLocation.set(newLocation);
mValid = true;
}

public void onProviderEnabled(String provider) {
}

public void onProviderDisabled(String provider) {
mValid = false;
}

public void onStatusChanged(String provider, int status, Bundle extras) {
if (status == LocationProvider.OUT_OF_SERVICE) {
mValid = false;
}
}

public Location current() {
return mValid ? mLastLocation : null;
}
};

private final class OneShotPreviewCallback
implements android.hardware.Camera.PreviewCallback {

@Override
public void onPreviewFrame(byte[] data,
android.hardware.Camera camera) {
Log.d(TAG, "OneShotPreviewCallback.onPreviewFrame");

if(data != null) {
Log.d(TAG, "data exist");

autoFocus();
}
}

};

private final class AutoFocusCallback
implements android.hardware.Camera.AutoFocusCallback {
public void onAutoFocus(
boolean focused, android.hardware.Camera camera) {
Log.d(TAG, "AutoFocusCallback.onAutoFocus");

if (mFocusState == FOCUSING_SNAP_ON_FINISH) {
if (focused) {
mFocusState = FOCUS_SUCCESS;
} else {
mFocusState = FOCUS_FAIL;
}
mImageCapture.onSnap();
} else if (mFocusState == FOCUSING) {
if (focused) {
mFocusState = FOCUS_SUCCESS;
} else {
mFocusState = FOCUS_FAIL;
}
mImageCapture.onSnap();
} else if (mFocusState == FOCUS_NOT_STARTED) {
}
}
};

private final class JpegPictureCallback implements PictureCallback {
Location mLocation;

public JpegPictureCallback(Location loc) {
mLocation = loc;
}

public void onPictureTaken(
byte [] jpegData, android.hardware.Camera camera) {
if (mPausing) {
return;
}

Log.d(TAG, "JpegPictureCallback.onPictureTaken");

mStatus = SNAPSHOT_COMPLETED;

stopPreview();
restartPreview();
if(jpegData != null) {
Log.d(TAG, "jpegData exist");
mJpegPreviewCallback.onPreviewFrame(jpegData, null);
}
}
};

public class ImageCapture {

private boolean mCancel = false;

/*
* Initiate the capture of an image.
*/
public void initiate() {
if (mCameraDevice == null) {
return;
}

mCancel = true;

capture();
}

private void capture() {

while (true && cameraExists) {
//obtaining display area to draw a large image
if(winWidth==0){
winWidth=this.getWidth();
winHeight=this.getHeight();

if(winWidth*3/4<=winHeight){
dw = 0;
dh = (winHeight-winWidth*3/4)/2;
rate = ((float)winWidth)/IMG_WIDTH;
rect = new Rect(dw,dh,dw+winWidth-1,dh+winWidth*3/4-1);
}else{
dw = (winWidth-winHeight*4/3)/2;
dh = 0;
rate = ((float)winHeight)/IMG_HEIGHT;
rect = new Rect(dw,dh,dw+winHeight*4/3 -1,dh+winHeight-1);
}
}

// obtaining a camera image (pixel data are stored in an array in JNI).
processCamera();
// camera image to bmp
pixeltobmp(bmp);

Canvas canvas = getHolder().lockCanvas();
if (canvas != null)
{
// draw camera bmp on canvas
canvas.drawBitmap(bmp,null,rect,null);

getHolder().unlockCanvasAndPost(canvas);
}

if(shouldStop){
shouldStop = false;
break;
}
}
mCameraDevice.setParameters(mParameters);
// mCameraDevice.takePicture(null, null, new JpegPictureCallback(loc));
mPreviewing = false;
}

private SurfaceHolder getHolder() {
// TODO Auto-generated method stub
return null;
}

private int getHeight() {
// TODO Auto-generated method stub
return 0;
}

private int getWidth() {
// TODO Auto-generated method stub
return 0;
}

public void onSnap() {
// If we are already in the middle of taking a snapshot then ignore.
if (mPausing || mStatus == SNAPSHOT_IN_PROGRESS) {
return;
}

mStatus = SNAPSHOT_IN_PROGRESS;

mImageCapture.initiate();
}
}

@Override
public void onStart() {
Log.d(TAG, "onStart");
}

@Override
public void onDestroy() {
Log.d(TAG, "onDestroy");
}

@Override
public void onResume() {
Log.d(TAG, "onResume");

mPausing = false;
mImageCapture = new ImageCapture();
}

@Override
public void onStop() {
Log.d(TAG, "onStop");
}

@Override
public void onPause() {
Log.d(TAG, "onPause");

mPausing = true;
stopPreview();

closeCamera();

stopReceivingLocationUpdates();

mImageCapture = null;

mHandler.removeMessages(NyARToolkitAndroidActivity.CLEAR_SCREEN_DELAY);
mHandler.removeMessages(NyARToolkitAndroidActivity.RESTART_PREVIEW);
mHandler.removeMessages(NyARToolkitAndroidActivity.FIRST_TIME_INIT);
mHandler.removeMessages(NyARToolkitAndroidActivity.SHOW_LOADING);
mHandler.removeMessages(NyARToolkitAndroidActivity.HIDE_LOADING);

if (mVoiceSound != null) {
mVoiceSound.release();
mVoiceSound = null;
}
}

private boolean canTakePicture() {
return isCameraIdle() && mPreviewing;
}

private void autoFocus() {
if (canTakePicture()) {
Log.v(TAG, "Start autofocus. ");
mFocusState = FOCUSING;
mCameraDevice.autoFocus(mAutoFocusCallback);
}
}

private void clearFocusState() {
mFocusState = FOCUS_NOT_STARTED;
}

public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {
Log.d(TAG, "surfaceChanged");
}

public void surfaceCreated(SurfaceHolder holder) {
Log.d(TAG, "surfaceCreated");
if(DEBUG) Log.d(TAG, "surfaceCreated");
if(bmp==null){
bmp = Bitmap.createBitmap(mViewFinderWidth, mViewFinderHeight, Bitmap.Config.ARGB_8888);
}
// /dev/videox (x=cameraId + cameraBase) is used
int ret = prepareCameraWithBase(cameraId, cameraBase);

if(ret!=-1) cameraExists = true;

mainLoop = new Thread((Runnable) this);
mainLoop.start();

}

public void surfaceDestroyed(SurfaceHolder holder) {
Log.d(TAG, "surfaceDestroyed");

if(DEBUG) Log.d(TAG, "surfaceDestroyed");
if(cameraExists){
shouldStop = true;
while(shouldStop){
try{
Thread.sleep(100); // wait for thread stopping
}catch(Exception e){}
}
}
stopCamera();
mSurfaceHolder = null;
}

private void closeCamera() {
if (mCameraDevice != null) {
stopPreview();
mCameraDevice.release();
mCameraDevice = null;
mPreviewing = false;
}
}

private boolean ensureCameraDevice() {
if (mCameraDevice == null) {
Log.d(TAG, "ensureCameraDevice");
mCameraDevice = android.hardware.Camera.open();
}
return mCameraDevice != null;
}

public void restartPreview() {
Log.d(TAG, "restartPreview");
// make sure the surfaceview fills the whole screen when previewing
mSurfaceView.requestLayout();
mSurfaceView.invalidate();
startPreview();
}

private void setPreviewDisplay(SurfaceHolder holder) {
try {
mCameraDevice.setPreviewDisplay(holder);
} catch (Throwable ex) {
closeCamera();
throw new RuntimeException("setPreviewDisplay failed", ex);
}
}

@SuppressLint("NewApi")
private void startPreview() {
if (mPausing || mMainActivity.isFinishing()) return;

ensureCameraDevice();

// If we're previewing already, stop the preview first (this will blank
// the screen).
if (mPreviewing) stopPreview();

setPreviewDisplay(mSurfaceHolder);
setCameraParameters();

mCameraDevice.setOneShotPreviewCallback(mOneShotPreviewCallback);

try {
Log.v(TAG, "startPreview");
mCameraDevice.startPreview();
} catch (Throwable ex) {
closeCamera();
throw new RuntimeException("startPreview failed", ex);
}
mPreviewing = true;
mStatus = IDLE;
}

private void stopPreview() {
if (mCameraDevice != null && mPreviewing) {
Log.v(TAG, "stopPreview");
mCameraDevice.stopPreview();
}
mPreviewing = false;
clearFocusState();
}

private void setCameraParameters() {
mParameters = mCameraDevice.getParameters();
// mParameters.setPreviewSize(mViewFinderWidth, mViewFinderHeight);
mParameters.setPreviewSize(352, 288);

// 352x288 only
String previewSize = "352x288";
mParameters.set(PARM_PREVIEW_SIZE, previewSize);

// 352x288 only, since 640x480 is too big for bitmap
String pictureSize = "352x288";
mParameters.set(PARM_PICTURE_SIZE, pictureSize);

// 85 only
String jpegQuality = "85";
mParameters.set(PARM_JPEG_QUALITY, jpegQuality);

mCameraDevice.setParameters(mParameters);
}

private void startReceivingLocationUpdates() {
if (mLocationManager != null) {
try {
mLocationManager.requestLocationUpdates(
LocationManager.NETWORK_PROVIDER,
1000,
0F,
mLocationListeners[1]);
} catch (java.lang.SecurityException ex) {
Log.d(TAG, "fail to request location update, ignore", ex);
} catch (IllegalArgumentException ex) {
Log.d(TAG, "provider does not exist " + ex.getMessage());
}
try {
mLocationManager.requestLocationUpdates(
LocationManager.GPS_PROVIDER,
1000,
0F,
mLocationListeners[0]);
} catch (java.lang.SecurityException ex) {
Log.d(TAG, "fail to request location update, ignore", ex);
} catch (IllegalArgumentException ex) {
Log.d(TAG, "provider does not exist " + ex.getMessage());
}
}
}

private void stopReceivingLocationUpdates() {
if (mLocationManager != null) {
for (int i = 0; i < mLocationListeners.length; i++) {
try {
mLocationManager.removeUpdates(mLocationListeners[i]);
} catch (Exception ex) {
Log.d(TAG, "fail to remove location listeners, ignore", ex);
}
}
}
}

public Location getCurrentLocation() {
// go in worst to best order
for (int i = 0; i < mLocationListeners.length; i++) {
Location l = mLocationListeners[i].current();
if (l != null) return l;
}
return null;
}

private boolean isCameraIdle() {
return mStatus == IDLE && mFocusState == FOCUS_NOT_STARTED;
//return mStatus == IDLE;
}
public void run() {
// TODO Auto-generated method stub

}
}
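The display-area arithmetic in capture() above (fitting the 4:3 camera image into the window) can be checked in isolation. Below is a standalone sketch with a hypothetical helper name, computeRect. Note that in the posted class, ImageCapture.getWidth()/getHeight() are auto-generated stubs that return 0 (and getHolder() returns null), so this math never sees real window dimensions — that alone would keep anything from being drawn:

```java
// Standalone version of the 4:3 letterbox math from capture().
// Returns {left, top, right, bottom}, mirroring the Rect built there.
class LetterboxMath {
    public static int[] computeRect(int winWidth, int winHeight) {
        int dw, dh;
        if (winWidth * 3 / 4 <= winHeight) {
            // Window is at least as tall as 4:3 -> bars on top/bottom.
            dw = 0;
            dh = (winHeight - winWidth * 3 / 4) / 2;
            return new int[]{dw, dh, dw + winWidth - 1, dh + winWidth * 3 / 4 - 1};
        } else {
            // Window is wider than 4:3 -> bars on left/right.
            dw = (winWidth - winHeight * 4 / 3) / 2;
            dh = 0;
            return new int[]{dw, dh, dw + winHeight * 4 / 3 - 1, dh + winHeight - 1};
        }
    }
}
```

For an 800x600 window (exactly 4:3) the rect fills the window; for a 1000x600 window the image is centered with 100-pixel bars left and right.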

I get an error in the NyARToolkitAndroidActivity class, in this part:

} else if(getString(R.string.camera_name).equals("jp.androidgroup.nyartoolkit.hardware.UVCCamera")) {
isUseSerface = true;

if (mTranslucentBackground) {
mGLSurfaceView = new GLSurfaceView(this);
mGLSurfaceView.setEGLConfigChooser(8, 8, 8, 8, 16, 0);
mGLSurfaceView.setRenderer(mRenderer);
mGLSurfaceView.getHolder().setFormat(PixelFormat.TRANSLUCENT);

SurfaceView mSurfaceView = new SurfaceView(this);
mCameraDevice = new UVCCamera(this, mSurfaceView);
mPreferences = PreferenceManager.getDefaultSharedPreferences(this);

setContentView(mGLSurfaceView);
addContentView(mSurfaceView, new LayoutParams(LayoutParams.FILL_PARENT, LayoutParams.FILL_PARENT));
} else {
setContentView(R.layout.uvccamera);
SurfaceView mSurfaceView = (SurfaceView) findViewById(R.id.UVC_camera_preview);
//here is the problem mCameraDevice = new UVCCamera(this, mSurfaceView);
mPreferences = PreferenceManager.getDefaultSharedPreferences(this);

mGLSurfaceView = (GLSurfaceView) findViewById(R.id.UVC_GL_view);
mGLSurfaceView.setRenderer(mRenderer);
}

I would really appreciate your help. Thanks.
In reply to #72910

Re: NyARToolkit for BeagleBoard-xm (2014-05-20 01:57 by noritsuna #73079)

Hi,

What's the error message?
And why do you use "private android.hardware.Camera mCameraDevice"?
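For context: the UVC sample does not go through android.hardware.Camera at all — it opens /dev/video(cameraId + cameraBase) directly via the JNI prepareCameraWithBase call. A tiny sketch of that device-node arithmetic from the sample's comments (hypothetical helper name):

```java
// Which V4L2 device node the UVC sample opens: /dev/video(cameraId + cameraBase).
// On some OMAP boards /dev/video0-3 are taken by the system, so use cameraBase=4.
class DeviceNode {
    public static String path(int cameraId, int cameraBase) {
        return "/dev/video" + (cameraId + cameraBase);
    }
}
```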

Regards,
Noritsuna

> Hello,
>
> I'm still working on the BeagleBoard. I downloaded NyARToolkitAndroid-2.5.2. When I opened the project I found the UVCCamera.java class; I tried it, but the USB camera did not work, so I started modifying the class like this:
>
> UVCCamera.java class
>
> package jp.androidgroup.nyartoolkit.hardware;
>
> import java.io.*;
>
> import jp.androidgroup.nyartoolkit.NyARToolkitAndroidActivity;
> import jp.androidgroup.nyartoolkit.model.VoicePlayer;
> import jp.androidgroup.nyartoolkit.view.GLSurfaceView;
> import android.annotation.SuppressLint;
> import android.app.Activity;
> import android.app.Dialog;
> import android.app.ProgressDialog;
> import android.content.Context;
> import android.content.Intent;
> import android.content.SharedPreferences;
> import android.graphics.Bitmap;
> import android.graphics.Canvas;
> import android.graphics.Matrix;
> import android.graphics.PixelFormat;
> import android.graphics.Rect;
> import android.hardware.Camera.Parameters;
> import android.hardware.Camera.PictureCallback;
> import android.hardware.Camera.PreviewCallback;
> import android.location.Location;
> import android.location.LocationManager;
> import android.location.LocationProvider;
> import android.media.MediaPlayer;
> import android.os.Bundle;
> import android.os.Handler;
> import android.os.Message;
> import android.preference.PreferenceManager;
> import android.util.Config;
> import android.util.Log;
> import android.view.OrientationEventListener;
> import android.view.SurfaceHolder;
> import android.view.SurfaceView;
> import android.view.View;
> import android.view.ViewGroup;
> import android.view.Window;
> import android.view.WindowManager;
> import android.widget.ImageView;
> import android.widget.Toast;
>
> public class UVCCamera extends Activity implements CameraIF, SurfaceHolder.Callback, Runnable{
>
> public static final String TAG = "UVCCamera";
>
> private PreviewCallback mJpegPreviewCallback = null;
>
> private android.hardware.Camera.Parameters mParameters;
>
> // The parameter strings to communicate with camera driver.
> public static final String PARM_PREVIEW_SIZE = "preview-size";
> public static final String PARM_PICTURE_SIZE = "picture-size";
> public static final String PARM_JPEG_QUALITY = "jpeg-quality";
> public static final String PARM_ROTATION = "rotation";
> public static final String PARM_GPS_LATITUDE = "gps-latitude";
> public static final String PARM_GPS_LONGITUDE = "gps-longitude";
> public static final String PARM_GPS_ALTITUDE = "gps-altitude";
> public static final String PARM_GPS_TIMESTAMP = "gps-timestamp";
> public static final String SUPPORTED_ZOOM = "zoom-values";
> public static final String SUPPORTED_PICTURE_SIZE = "picture-size-values";
>
> private SharedPreferences mPreferences;
>
> public static final int IDLE = 1;
> public static final int SNAPSHOT_IN_PROGRESS = 2;
> public static final int SNAPSHOT_COMPLETED = 3;
>
> private int mStatus = IDLE;
>
> private android.hardware.Camera mCameraDevice;
>
> private NyARToolkitAndroidActivity mMainActivity;
> private SurfaceView mSurfaceView;
> private SurfaceHolder mSurfaceHolder = null;
>
> private int mViewFinderWidth, mViewFinderHeight;
>
> private ImageCapture mImageCapture = null;
>
> private boolean mPreviewing;
> private boolean mPausing;
> private boolean mRecordLocation;
>
> private static final int FOCUS_NOT_STARTED = 0;
> private static final int FOCUSING = 1;
> private static final int FOCUSING_SNAP_ON_FINISH = 2;
> private static final int FOCUS_SUCCESS = 3;
> private static final int FOCUS_FAIL = 4;
> private int mFocusState = FOCUS_NOT_STARTED;
>
> private LocationManager mLocationManager = null;
>
> private final OneShotPreviewCallback mOneShotPreviewCallback = new OneShotPreviewCallback();
> private final AutoFocusCallback mAutoFocusCallback = new AutoFocusCallback();
>
> private String mFocusMode;
>
> private Handler mHandler = null;
>
> private MediaPlayer mVoiceSound = null;
>
>
>
> //** Modified part of the class **//
>
> private Bitmap bmp=null;
>
> private static final boolean DEBUG = true;
>
> // This definition also exists in ImageProc.h.
> // Webcam must support the resolution 640x480 with YUYV format.
> static final int IMG_WIDTH=640;
> static final int IMG_HEIGHT=480;
>
> // The following variables are used to draw camera images.
> private int winWidth=0;
> private int winHeight=0;
> private Rect rect;
> private int dw, dh;
> private float rate;
>
> // /dev/videox (x=cameraId+cameraBase) is used.
> // In some omap devices, system uses /dev/video[0-3],
> // so users must use /dev/video[4-].
> // In such a case, try cameraId=0 and cameraBase=4
>
> private int cameraId=0;
> private int cameraBase=0;
>
> // JNI functions
> public native int prepareCamera(int videoid);
> public native int prepareCameraWithBase(int videoid, int camerabase);
> public native void processCamera();
> public native void stopCamera();
> public native void pixeltobmp(Bitmap bitmap);
> static {
> System.loadLibrary("ImageProc");
> }
>
> private boolean cameraExists=false;
> private boolean shouldStop=false;
> Thread mainLoop = null;
>
>
>
> private LocationListener [] mLocationListeners = new LocationListener[] {
> new LocationListener(LocationManager.GPS_PROVIDER),
> new LocationListener(LocationManager.NETWORK_PROVIDER)
> };
>
>
> public UVCCamera(NyARToolkitAndroidActivity mMainActivity, SurfaceView mSurfaceView) {
> Log.d(TAG, "instance");
>
> this.mMainActivity = mMainActivity;
> mPreferences = PreferenceManager.getDefaultSharedPreferences(mMainActivity);
> mLocationManager = (LocationManager) mMainActivity.getSystemService(Context.LOCATION_SERVICE);
>
> this.mSurfaceView = mSurfaceView;
> SurfaceHolder holder = mSurfaceView.getHolder();
> holder.addCallback(mMainActivity);
> holder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
>
> mHandler = mMainActivity.getMessageHandler();
> }
>
> public void setPreviewCallback(PreviewCallback callback) {
> mJpegPreviewCallback = callback;
> }
>
> public void setParameters(Parameters params) {
> mCameraDevice.setParameters(params);
> }
>
> public Parameters getParameters() {
> return mCameraDevice.getParameters();
> }
>
> public void resetPreviewSize(int width, int height) {
> }
>
> /**
> * This Handler is used to post message back onto the main thread of the
> * application
> */
> public void handleMessage(Message msg) {
> switch (msg.what) {
> case NyARToolkitAndroidActivity.RESTART_PREVIEW: {
> if (mStatus == SNAPSHOT_IN_PROGRESS) {
> // We are still in the process of taking the picture; wait.
> // This is strange. Why are we polling?
> // TODO remove polling
> Log.d(TAG, "sendEmptyMessageDelayed(RESTART_PREVIEW)");
> mHandler.sendEmptyMessageDelayed(NyARToolkitAndroidActivity.RESTART_PREVIEW, 100);
> }
> else
> restartPreview();
> break;
> }
>
> case NyARToolkitAndroidActivity.SHOW_LOADING: {
> stopPreview();
> break;
> }
>
> case NyARToolkitAndroidActivity.HIDE_LOADING: {
> startPreview();
> break;
> }
> }
> }
>
> private class LocationListener
> implements android.location.LocationListener {
> Location mLastLocation;
> boolean mValid = false;
> String mProvider;
>
> public LocationListener(String provider) {
> mProvider = provider;
> mLastLocation = new Location(mProvider);
> }
>
> public void onLocationChanged(Location newLocation) {
> if (newLocation.getLatitude() == 0.0 && newLocation.getLongitude() == 0.0) {
> // Hack to filter out 0.0,0.0 locations
> return;
> }
> mLastLocation.set(newLocation);
> mValid = true;
> }
>
> public void onProviderEnabled(String provider) {
> }
>
> public void onProviderDisabled(String provider) {
> mValid = false;
> }
>
> public void onStatusChanged(String provider, int status, Bundle extras) {
> if (status == LocationProvider.OUT_OF_SERVICE) {
> mValid = false;
> }
> }
>
> public Location current() {
> return mValid ? mLastLocation : null;
> }
> };
>
> private final class OneShotPreviewCallback
> implements android.hardware.Camera.PreviewCallback {
>
> @Override
> public void onPreviewFrame(byte[] data,
> android.hardware.Camera camera) {
> Log.d(TAG, "OneShotPreviewCallback.onPreviewFrame");
>
> if(data != null) {
> Log.d(TAG, "data exist");
>
> autoFocus();
> }
> }
>
> };
>
> private final class AutoFocusCallback
> implements android.hardware.Camera.AutoFocusCallback {
> public void onAutoFocus(
> boolean focused, android.hardware.Camera camera) {
> Log.d(TAG, "AutoFocusCallback.onAutoFocus");
>
> if (mFocusState == FOCUSING_SNAP_ON_FINISH) {
> if (focused) {
> mFocusState = FOCUS_SUCCESS;
> } else {
> mFocusState = FOCUS_FAIL;
> }
> mImageCapture.onSnap();
> } else if (mFocusState == FOCUSING) {
> if (focused) {
> mFocusState = FOCUS_SUCCESS;
> } else {
> mFocusState = FOCUS_FAIL;
> }
> mImageCapture.onSnap();
> } else if (mFocusState == FOCUS_NOT_STARTED) {
> }
> }
> };
>
> private final class JpegPictureCallback implements PictureCallback {
> Location mLocation;
>
> public JpegPictureCallback(Location loc) {
> mLocation = loc;
> }
>
> public void onPictureTaken(
> byte [] jpegData, android.hardware.Camera camera) {
> if (mPausing) {
> return;
> }
>
> Log.d(TAG, "JpegPictureCallback.onPictureTaken");
>
> mStatus = SNAPSHOT_COMPLETED;
>
> stopPreview();
> restartPreview();
> if(jpegData != null) {
> Log.d(TAG, "jpegData exist");
> mJpegPreviewCallback.onPreviewFrame(jpegData, null);
> }
> }
> };
>
> public class ImageCapture {
>
> private boolean mCancel = false;
>
> /*
> * Initiate the capture of an image.
> */
> public void initiate() {
> if (mCameraDevice == null) {
> return;
> }
>
> mCancel = true;
>
> capture();
> }
>
> private void capture() {
>
> while (true && cameraExists) {
> //obtaining display area to draw a large image
> if(winWidth==0){
> winWidth=this.getWidth();
> winHeight=this.getHeight();
>
> if(winWidth*3/4<=winHeight){
> dw = 0;
> dh = (winHeight-winWidth*3/4)/2;
> rate = ((float)winWidth)/IMG_WIDTH;
> rect = new Rect(dw,dh,dw+winWidth-1,dh+winWidth*3/4-1);
> }else{
> dw = (winWidth-winHeight*4/3)/2;
> dh = 0;
> rate = ((float)winHeight)/IMG_HEIGHT;
> rect = new Rect(dw,dh,dw+winHeight*4/3 -1,dh+winHeight-1);
> }
> }
>
> // obtaining a camera image (pixel data are stored in an array in JNI).
> processCamera();
> // camera image to bmp
> pixeltobmp(bmp);
>
> Canvas canvas = getHolder().lockCanvas();
> if (canvas != null)
> {
> // draw camera bmp on canvas
> canvas.drawBitmap(bmp,null,rect,null);
>
> getHolder().unlockCanvasAndPost(canvas);
> }
>
> if(shouldStop){
> shouldStop = false;
> break;
> }
> }
> mCameraDevice.setParameters(mParameters);
> // mCameraDevice.takePicture(null, null, new JpegPictureCallback(loc));
> mPreviewing = false;
> }
>
> private SurfaceHolder getHolder() {
> // TODO Auto-generated method stub
> return null;
> }
>
> private int getHeight() {
> // TODO Auto-generated method stub
> return 0;
> }
>
> private int getWidth() {
> // TODO Auto-generated method stub
> return 0;
> }
>
> public void onSnap() {
> // If we are already in the middle of taking a snapshot then ignore.
> if (mPausing || mStatus == SNAPSHOT_IN_PROGRESS) {
> return;
> }
>
> mStatus = SNAPSHOT_IN_PROGRESS;
>
> mImageCapture.initiate();
> }
> }
>
> @Override
> public void onStart() {
> Log.d(TAG, "onStart");
> }
>
> @Override
> public void onDestroy() {
> Log.d(TAG, "onDestroy");
> }
>
> @Override
> public void onResume() {
> Log.d(TAG, "onResume");
>
> mPausing = false;
> mImageCapture = new ImageCapture();
> }
>
> @Override
> public void onStop() {
> Log.d(TAG, "onStop");
> }
>
> @Override
> public void onPause() {
> Log.d(TAG, "onPause");
>
> mPausing = true;
> stopPreview();
>
> closeCamera();
>
> stopReceivingLocationUpdates();
>
> mImageCapture = null;
>
> mHandler.removeMessages(NyARToolkitAndroidActivity.CLEAR_SCREEN_DELAY);
> mHandler.removeMessages(NyARToolkitAndroidActivity.RESTART_PREVIEW);
> mHandler.removeMessages(NyARToolkitAndroidActivity.FIRST_TIME_INIT);
> mHandler.removeMessages(NyARToolkitAndroidActivity.SHOW_LOADING);
> mHandler.removeMessages(NyARToolkitAndroidActivity.HIDE_LOADING);
>
> if (mVoiceSound != null) {
> mVoiceSound.release();
> mVoiceSound = null;
> }
> }
>
> private boolean canTakePicture() {
> return isCameraIdle() && mPreviewing;
> }
>
> private void autoFocus() {
> if (canTakePicture()) {
> Log.v(TAG, "Start autofocus. ");
> mFocusState = FOCUSING;
> mCameraDevice.autoFocus(mAutoFocusCallback);
> }
> }
>
> private void clearFocusState() {
> mFocusState = FOCUS_NOT_STARTED;
> }
>
> public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {
> Log.d(TAG, "surfaceChanged");
> }
>
> public void surfaceCreated(SurfaceHolder holder) {
> Log.d(TAG, "surfaceCreated");
> if(DEBUG) Log.d(TAG, "surfaceCreated");
> if(bmp==null){
> bmp = Bitmap.createBitmap(mViewFinderWidth, mViewFinderHeight, Bitmap.Config.ARGB_8888);
> }
> // /dev/videox (x=cameraId + cameraBase) is used
> int ret = prepareCameraWithBase(cameraId, cameraBase);
>
> if(ret!=-1) cameraExists = true;
>
> mainLoop = new Thread((Runnable) this);
> mainLoop.start();
>
> }
>
> public void surfaceDestroyed(SurfaceHolder holder) {
> Log.d(TAG, "surfaceDestroyed");
>
> if(DEBUG) Log.d(TAG, "surfaceDestroyed");
> if(cameraExists){
> shouldStop = true;
> while(shouldStop){
> try{
> Thread.sleep(100); // wait for thread stopping
> }catch(Exception e){}
> }
> }
> stopCamera();
> mSurfaceHolder = null;
> }
>
> private void closeCamera() {
> if (mCameraDevice != null) {
> stopPreview();
> mCameraDevice.release();
> mCameraDevice = null;
> mPreviewing = false;
> }
> }
>
> private boolean ensureCameraDevice() {
> if (mCameraDevice == null) {
> Log.d(TAG, "ensureCameraDevice");
> mCameraDevice = android.hardware.Camera.open();
> }
> return mCameraDevice != null;
> }
>
> public void restartPreview() {
> Log.d(TAG, "restartPreview");
> // make sure the surfaceview fills the whole screen when previewing
> mSurfaceView.requestLayout();
> mSurfaceView.invalidate();
> startPreview();
> }
>
> private void setPreviewDisplay(SurfaceHolder holder) {
> try {
> mCameraDevice.setPreviewDisplay(holder);
> } catch (Throwable ex) {
> closeCamera();
> throw new RuntimeException("setPreviewDisplay failed", ex);
> }
> }
>
> @SuppressLint("NewApi")
> private void startPreview() {
> if (mPausing || mMainActivity.isFinishing()) return;
>
> ensureCameraDevice();
>
> // If we're previewing already, stop the preview first (this will blank
> // the screen).
> if (mPreviewing) stopPreview();
>
> setPreviewDisplay(mSurfaceHolder);
> setCameraParameters();
>
> mCameraDevice.setOneShotPreviewCallback(mOneShotPreviewCallback);
>
> try {
> Log.v(TAG, "startPreview");
> mCameraDevice.startPreview();
> } catch (Throwable ex) {
> closeCamera();
> throw new RuntimeException("startPreview failed", ex);
> }
> mPreviewing = true;
> mStatus = IDLE;
> }
>
> private void stopPreview() {
> if (mCameraDevice != null && mPreviewing) {
> Log.v(TAG, "stopPreview");
> mCameraDevice.stopPreview();
> }
> mPreviewing = false;
> clearFocusState();
> }
>
> private void setCameraParameters() {
> mParameters = mCameraDevice.getParameters();
> // mParameters.setPreviewSize(mViewFinderWidth, mViewFinderHeight);
> mParameters.setPreviewSize(352, 288);
>
> // 352x288 only
> String previewSize = "352x288";
> mParameters.set(PARM_PREVIEW_SIZE, previewSize);
>
> // 352x288 only, since 640x480 is too big for bitmap
> String pictureSize = "352x288";
> mParameters.set(PARM_PICTURE_SIZE, pictureSize);
>
> // 85 only
> String jpegQuality = "85";
> mParameters.set(PARM_JPEG_QUALITY, jpegQuality);
>
> mCameraDevice.setParameters(mParameters);
> }
>
> private void startReceivingLocationUpdates() {
> if (mLocationManager != null) {
> try {
> mLocationManager.requestLocationUpdates(
> LocationManager.NETWORK_PROVIDER,
> 1000,
> 0F,
> mLocationListeners[1]);
> } catch (java.lang.SecurityException ex) {
> Log.d(TAG, "fail to request location update, ignore", ex);
> } catch (IllegalArgumentException ex) {
> Log.d(TAG, "provider does not exist " + ex.getMessage());
> }
> try {
> mLocationManager.requestLocationUpdates(
> LocationManager.GPS_PROVIDER,
> 1000,
> 0F,
> mLocationListeners[0]);
> } catch (java.lang.SecurityException ex) {
> Log.d(TAG, "fail to request location update, ignore", ex);
> } catch (IllegalArgumentException ex) {
> Log.d(TAG, "provider does not exist " + ex.getMessage());
> }
> }
> }
>
> private void stopReceivingLocationUpdates() {
> if (mLocationManager != null) {
> for (int i = 0; i < mLocationListeners.length; i++) {
> try {
> mLocationManager.removeUpdates(mLocationListeners[i]);
> } catch (Exception ex) {
> Log.d(TAG, "fail to remove location listners, ignore", ex);
> }
> }
> }
> }
>
> public Location getCurrentLocation() {
> // go in worst to best order
> for (int i = 0; i < mLocationListeners.length; i++) {
> Location l = mLocationListeners[i].current();
> if (l != null) return l;
> }
> return null;
> }
>
> private boolean isCameraIdle() {
> return mStatus == IDLE && mFocusState == FOCUS_NOT_STARTED;
> //return mStatus == IDLE;
> }
> public void run() {
> // TODO Auto-generated method stub
>
> }
> }
>
> I have an error in the NyARToolkitAndroidActivity class, in:
>
> } else if(getString(R.string.camera_name).equals("jp.androidgroup.nyartoolkit.hardware.UVCCamera")) {
> isUseSerface = true;
>
> if (mTranslucentBackground) {
> mGLSurfaceView = new GLSurfaceView(this);
> mGLSurfaceView.setEGLConfigChooser(8, 8, 8, 8, 16, 0);
> mGLSurfaceView.setRenderer(mRenderer);
> mGLSurfaceView.getHolder().setFormat(PixelFormat.TRANSLUCENT);
>
> SurfaceView mSurfaceView = new SurfaceView(this);
> mCameraDevice = new UVCCamera(this, mSurfaceView);
> mPreferences = PreferenceManager.getDefaultSharedPreferences(this);
>
> setContentView(mGLSurfaceView);
> addContentView(mSurfaceView, new LayoutParams(LayoutParams.FILL_PARENT, LayoutParams.FILL_PARENT));
> } else {
> setContentView(R.layout.uvccamera);
> SurfaceView mSurfaceView = (SurfaceView) findViewById(R.id.UVC_camera_preview);
> //here is the problem mCameraDevice = new UVCCamera(this, mSurfaceView);
> mPreferences = PreferenceManager.getDefaultSharedPreferences(this);
>
> mGLSurfaceView = (GLSurfaceView) findViewById(R.id.UVC_GL_view);
> mGLSurfaceView.setRenderer(mRenderer);
> }
>
> I would really appreciate your help. Thanks.
In reply to #73078

Re: NyARToolkit for BeagleBoard-xm (2014-05-24 03:24 by johanna2 #73143)

[In reply to message #73079]
> Hi,
>
> What's the error message?
> And Why do you use "private android.hardware.Camera mCameraDevice"?
>
> Regards,
> Noritsuna
>
Hi,

Thanks for your answer; let me explain the situation in more detail. I opened the NyARToolkitAndroid-2.5.2 project; there is a package with the following classes:

jp.androidgroup.nyartoolkit.hardware -------> Package

*CameraIF.java
*Dev1Camera.java
*HT03ACamera.java
*N1Camera.java
*SocketCamera.java
*StaticCamera.java
*UVCCamera.java

I noticed that CameraIF.java is an interface for cameras.

In another package --------> jp.androidgroup.nyartoolkit, NyARToolkitAndroidActivity.java is the main class; it chooses the camera class in these lines:

// init Camera.

****** For example, in case we are using the UVCCamera class ********

else if(getString(R.string.camera_name).equals("jp.androidgroup.nyartoolkit.hardware.UVCCamera")) {

isUseSerface = true;
if (mTranslucentBackground) {
mGLSurfaceView = new GLSurfaceView(this);
mGLSurfaceView.setEGLConfigChooser(8, 8, 8, 8, 16, 0);
mGLSurfaceView.setRenderer(mRenderer);
mGLSurfaceView.getHolder().setFormat(PixelFormat.TRANSLUCENT);
SurfaceView mSurfaceView = new SurfaceView(this);
mCameraDevice = new UVCCamera(this, mSurfaceView);
mPreferences = PreferenceManager.getDefaultSharedPreferences(this);
setContentView(mGLSurfaceView);
addContentView(mSurfaceView, new LayoutParams(LayoutParams.FILL_PARENT, LayoutParams.FILL_PARENT));
} else {
setContentView(R.layout.uvccamera);
SurfaceView mSurfaceView = (SurfaceView) findViewById(R.id.UVC_camera_preview);
mCameraDevice = new UVCCamera(this, mSurfaceView);
mPreferences = PreferenceManager.getDefaultSharedPreferences(this);
mGLSurfaceView = (GLSurfaceView) findViewById(R.id.UVC_GL_view);
mGLSurfaceView.setRenderer(mRenderer);

***********************************************************************

So I decided to modify UVCCamera.java, the class originally included in this NyARToolkit package. In this class they are using:

private android.hardware.Camera mCameraDevice

I added the following code to the UVCCamera class, based on the UVCCameraPreview example you sent me:

*************Here I added run method*****************

But I'm not sure about mCameraDevice, and I don't know how to merge this run method in this case, so I did it like this:

public class ImageCapture {

private boolean mCancel = false;

/*
* Initiate the capture of an image.
*/
public void initiate() {
if (mCameraDevice == null) {
return;
}

mCancel = true;

capture();
}

private void capture() {

while (true && cameraExists) {
//obtaining display area to draw a large image
if(winWidth==0){
winWidth=this.getWidth();
winHeight=this.getHeight();

if(winWidth*3/4<=winHeight){
dw = 0;
dh = (winHeight-winWidth*3/4)/2;
rate = ((float)winWidth)/IMG_WIDTH;
rect = new Rect(dw,dh,dw+winWidth-1,dh+winWidth*3/4-1);
}else{
dw = (winWidth-winHeight*4/3)/2;
dh = 0;
rate = ((float)winHeight)/IMG_HEIGHT;
rect = new Rect(dw,dh,dw+winHeight*4/3 -1,dh+winHeight-1);
}
}

// obtaining a camera image (pixel data are stored in an array in JNI).
processCamera();
// camera image to bmp
pixeltobmp(bmp);

Canvas canvas = getHolder().lockCanvas();
if (canvas != null)
{
// draw camera bmp on canvas
canvas.drawBitmap(bmp,null,rect,null);

getHolder().unlockCanvasAndPost(canvas);
}

if(shouldStop){
shouldStop = false;
break;
}
}
mCameraDevice.setParameters(mParameters);
// mCameraDevice.takePicture(null, null, new JpegPictureCallback(loc));
mPreviewing = false;
}
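The display-area arithmetic in capture() above can be checked on its own. The sketch below reproduces the same integer math for fitting a 640x480 camera frame into the window, either letterboxed (window taller than 4:3) or pillarboxed (window wider than 4:3); the class and method names are illustrative, not part of NyARToolkit:

```java
// Standalone sketch of the aspect-fit math in the capture()/run() loop.
public class AspectFit {
    static final int IMG_WIDTH = 640, IMG_HEIGHT = 480;

    // Returns {left, top, right, bottom} of the destination rectangle,
    // using the same integer arithmetic as the post.
    static int[] fit(int winWidth, int winHeight) {
        int dw, dh;
        if (winWidth * 3 / 4 <= winHeight) {         // window taller than 4:3
            dw = 0;
            dh = (winHeight - winWidth * 3 / 4) / 2; // center vertically
            return new int[]{dw, dh, dw + winWidth - 1, dh + winWidth * 3 / 4 - 1};
        } else {                                     // window wider than 4:3
            dw = (winWidth - winHeight * 4 / 3) / 2; // center horizontally
            dh = 0;
            return new int[]{dw, dh, dw + winHeight * 4 / 3 - 1, dh + winHeight - 1};
        }
    }

    public static void main(String[] args) {
        // 800x480 window is wider than 4:3, so the frame is centered
        // with 80px bars on the left and right.
        int[] r = fit(800, 480);
        System.out.println(r[0] + "," + r[1] + "," + r[2] + "," + r[3]);
    }
}
```

Note that `rate` in the original code is computed but never used in the drawing path; only `rect` matters for `drawBitmap`.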

***************************Then I modified******************************************************



public void surfaceCreated(SurfaceHolder holder) {
Log.d(TAG, "surfaceCreated");
if(DEBUG) Log.d(TAG, "surfaceCreated");
if(bmp==null){
bmp = Bitmap.createBitmap(mViewFinderWidth, mViewFinderHeight, Bitmap.Config.ARGB_8888);
}
// /dev/videox (x=cameraId + cameraBase) is used
int ret = prepareCameraWithBase(cameraId, cameraBase);

if(ret!=-1) cameraExists = true;

mainLoop = new Thread((Runnable) this);
mainLoop.start();

}

*********************************And finally I modified this**********************************************************

public void surfaceDestroyed(SurfaceHolder holder) {
Log.d(TAG, "surfaceDestroyed");

if(DEBUG) Log.d(TAG, "surfaceDestroyed");
if(cameraExists){
shouldStop = true;
while(shouldStop){
try{
Thread.sleep(100); // wait for thread stopping
}catch(Exception e){}
}
}
stopCamera();
mSurfaceHolder = null;
}

*************************************************************************************
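As an aside, the stop handshake used in surfaceDestroyed above (set `shouldStop`, then poll with `Thread.sleep(100)` until the loop clears it) can be expressed more directly with a `volatile` flag and `Thread.join()`. This is a minimal standalone sketch, not NyARToolkit code; the JNI calls are replaced by a stub:

```java
// Sketch of a stop handshake between a UI callback and a preview loop.
public class StopHandshake {
    private volatile boolean shouldStop = false; // volatile: visible across threads
    private Thread mainLoop;

    public void start() {
        mainLoop = new Thread(() -> {
            while (!shouldStop) {
                // processCamera(); pixeltobmp(bmp); ... would run here
                Thread.yield();
            }
        });
        mainLoop.start();
    }

    public void stop() throws InterruptedException {
        shouldStop = true;  // ask the loop to exit
        mainLoop.join();    // block until the loop thread has actually finished
    }

    public boolean isRunning() {
        return mainLoop != null && mainLoop.isAlive();
    }

    public static void main(String[] args) throws InterruptedException {
        StopHandshake h = new StopHandshake();
        h.start();
        h.stop();
        System.out.println("running after stop: " + h.isRunning());
    }
}
```

With `join()` there is no busy-wait, and `stopCamera()` can safely be called right after `stop()` returns.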
When I execute the app, it says "Unfortunately, the app has stopped". The problem is in this line of the class:


NyARToolkitAndroidActivity.java-------> Class

Line 198---->mCameraDevice = new UVCCamera(this, mSurfaceView);

In this class mCameraDevice is defined as:

private CameraIF mCameraDevice;


*****************************Conclusion**************************************************

Do you think it would be better to modify another class of NyARToolkit, for example the HT03ACamera class? Maybe I have unnecessary code in the UVCCamera class. I want to make clear that this UVCCamera class IS NOT the class you sent me; it is a class included in the original package, which I am modifying. Can you explain in more detail how to merge the webcam code with NyARToolkit? Thanks a lot.

In reply to #73079

Re: NyARToolkit for BeagleBoard-xm (2014-05-24 03:28 by johanna2 #73144)

(This post quoted message #73143 in full, with no additional content.)

Re: NyARToolkit for BeagleBoard-xm (2014-05-25 21:15 by johanna2 #73155)

Hi,

I'm writing to you again because I decided to create a new class called:

Cameraweb.java

*************************This is the class******************************************


package jp.androidgroup.nyartoolkit.hardware;

import android.app.Activity;
import android.content.Context;
import android.os.Message;
import android.util.AttributeSet;
import android.util.Log;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.graphics.Bitmap;
import android.graphics.Canvas;
import android.graphics.Rect;
import android.hardware.Camera.Parameters;
import android.hardware.Camera.PreviewCallback;

public class Cameraweb extends SurfaceView implements CameraIF, SurfaceHolder.Callback, Runnable{

private static final boolean DEBUG = true;
private static final String TAG="WebCam";
protected Context context;
private SurfaceHolder holder;
Thread mainLoop = null;
private Bitmap bmp=null;

private boolean cameraExists=false;
private boolean shouldStop=false;

// /dev/videox (x=cameraId+cameraBase) is used.
// In some omap devices, system uses /dev/video[0-3],
// so users must use /dev/video[4-].
// In such a case, try cameraId=0 and cameraBase=4
private int cameraId=0;
private int cameraBase=0;

// This definition also exists in ImageProc.h.
// Webcam must support the resolution 640x480 with YUYV format.
static final int IMG_WIDTH=640;
static final int IMG_HEIGHT=480;

// The following variables are used to draw camera images.
private int winWidth=0;
private int winHeight=0;
private Rect rect;
private int dw, dh;
private float rate;

// JNI functions
public native int prepareCamera(int videoid);
public native int prepareCameraWithBase(int videoid, int camerabase);
public native void processCamera();
public native void stopCamera();
public native void pixeltobmp(Bitmap bitmap);
static {
System.loadLibrary("ImageProc");
}

public Cameraweb(Context context) {
super(context);
this.context = context;
if(DEBUG) Log.d(TAG,"CameraPreview constructed");
setFocusable(true);

holder = getHolder();
holder.addCallback(this);
holder.setType(SurfaceHolder.SURFACE_TYPE_NORMAL);
}


public Cameraweb(Context context, SurfaceView mSurfaceView) {
super(context);
this.context = context;
if(DEBUG) Log.d(TAG,"CameraPreview constructed");
setFocusable(true);

holder = getHolder();
holder.addCallback(this);
holder.setType(SurfaceHolder.SURFACE_TYPE_NORMAL);
}

@Override
public void run() {
while (true && cameraExists) {
//obtaining display area to draw a large image
if(winWidth==0){
winWidth=this.getWidth();
winHeight=this.getHeight();

if(winWidth*3/4<=winHeight){
dw = 0;
dh = (winHeight-winWidth*3/4)/2;
rate = ((float)winWidth)/IMG_WIDTH;
rect = new Rect(dw,dh,dw+winWidth-1,dh+winWidth*3/4-1);
}else{
dw = (winWidth-winHeight*4/3)/2;
dh = 0;
rate = ((float)winHeight)/IMG_HEIGHT;
rect = new Rect(dw,dh,dw+winHeight*4/3 -1,dh+winHeight-1);
}
}

// obtaining a camera image (pixel data are stored in an array in JNI).
processCamera();
// camera image to bmp
pixeltobmp(bmp);

Canvas canvas = getHolder().lockCanvas();
if (canvas != null)
{
// draw camera bmp on canvas
canvas.drawBitmap(bmp,null,rect,null);

getHolder().unlockCanvasAndPost(canvas);
}

if(shouldStop){
shouldStop = false;
break;
}
}
}

@Override
public void surfaceCreated(SurfaceHolder holder) {
if(DEBUG) Log.d(TAG, "surfaceCreated");
if(bmp==null){
bmp = Bitmap.createBitmap(IMG_WIDTH, IMG_HEIGHT, Bitmap.Config.ARGB_8888);
}
// /dev/videox (x=cameraId + cameraBase) is used
int ret = prepareCameraWithBase(cameraId, cameraBase);

if(ret!=-1) cameraExists = true;

mainLoop = new Thread(this);
mainLoop.start();
}

@Override
public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
if(DEBUG) Log.d(TAG, "surfaceChanged");
}

@Override
public void surfaceDestroyed(SurfaceHolder holder) {
if(DEBUG) Log.d(TAG, "surfaceDestroyed");
if(cameraExists){
shouldStop = true;
while(shouldStop){
try{
Thread.sleep(100); // wait for thread stopping
}catch(Exception e){}
}
}
stopCamera();
}
@Override
public void setPreviewCallback(PreviewCallback cb) {
// TODO Auto-generated method stub

}
@Override
public void setParameters(Parameters params) {
// TODO Auto-generated method stub

}
@Override
public Parameters getParameters() {
// TODO Auto-generated method stub
return null;
}
@Override
public void resetPreviewSize(int width, int height) {
// TODO Auto-generated method stub

}
@Override
public void onStart() {
// TODO Auto-generated method stub

}
@Override
public void onResume() {
// TODO Auto-generated method stub

}
@Override
public void onStop() {
// TODO Auto-generated method stub

}
@Override
public void onPause() {
// TODO Auto-generated method stub

}
@Override
public void onDestroy() {
// TODO Auto-generated method stub

}
@Override
public void handleMessage(Message msg) {
// TODO Auto-generated method stub

}
}

***************************************************************************************
Here I just copied the UVCCamera class you sent me, extended SurfaceView, and implemented CameraIF, SurfaceHolder.Callback, and Runnable.

I still don't know how to implement the Cameraweb.java methods, or how to merge them with the CameraIF methods, which are:

******************************CameraIF.java class ******************************
/**
* It is an interface for cameras.
*
* @author noritsuna
*
*/
public interface CameraIF {

public void setPreviewCallback(PreviewCallback cb);

public void setParameters(Parameters params);
public Parameters getParameters();
public void resetPreviewSize(int width, int height);

public void onStart();
public void onResume();
public void onStop();
public void onPause();
public void onDestroy();


public void handleMessage(Message msg);

}
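The interface above explains the earlier compile problem: the field `private CameraIF mCameraDevice;` can only hold an object whose class declares `implements CameraIF` and provides every method. The sketch below illustrates that contract in plain Java; the Android types (PreviewCallback, Parameters, Message) are omitted, so this is a reduced stand-in, not the real NyARToolkit interface:

```java
// Why "mCameraDevice = new Cameraweb(...)" only typechecks when the
// class implements CameraIF, shown with a reduced stand-in interface.
public class CameraIFDemo {
    interface CameraIF {
        void onStart();
        void onPause();
        // The real CameraIF also declares setPreviewCallback, setParameters,
        // getParameters, resetPreviewSize, onResume, onStop, onDestroy,
        // and handleMessage.
    }

    // Usable through a CameraIF-typed field only because it declares
    // "implements CameraIF" and overrides every method.
    static class WebCameraStub implements CameraIF {
        boolean started = false;
        @Override public void onStart() { started = true; }
        @Override public void onPause() { started = false; }
    }

    public static void main(String[] args) {
        CameraIF mCameraDevice = new WebCameraStub(); // compiles: stub implements CameraIF
        mCameraDevice.onStart();
        System.out.println("camera started");
    }
}
```

A class that merely extends SurfaceView without the `implements CameraIF` clause (or that leaves an interface method out) is rejected at compile time when assigned to the field.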

***************************NyARToolkitAndroidActivity******************************************

I have to do this because in the NyARToolkitAndroidActivity.java class we have:

private CameraIF mCameraDevice;
private SurfaceHolder mSurfaceHolder = null;
private GLSurfaceView mGLSurfaceView;
private ModelRenderer mRenderer;

Then I added an else if conditional like this:

else if (getString(R.string.camera_name).equals("jp.androidgroup.nyartoolkit.hardware.Cameraweb")) {
isUseSerface = true;
setContentView(R.layout.cameraweb);
SurfaceView mSurfaceView = (SurfaceView) findViewById(R.id.Cameraweb_camera_preview);
mCameraDevice = new Cameraweb(this, mSurfaceView);
mPreferences = PreferenceManager.getDefaultSharedPreferences(this);
mGLSurfaceView = (GLSurfaceView) findViewById(R.id.Cameraweb_GL_view);
mGLSurfaceView.setRenderer(mRenderer);
mGLSurfaceView.getHolder().setFormat(PixelFormat.RGBA_8888);
}
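The chain of `else if (getString(R.string.camera_name).equals(...))` branches above is name-based dispatch. Outside Android it can be sketched as a registry that maps the configured class name to a factory, so adding a camera like Cameraweb becomes one `put()` instead of another branch. This is an illustrative alternative design, not how NyARToolkit is actually written; all names here are hypothetical:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Supplier;

// Sketch of the camera_name dispatch as a registry of factories.
public class CameraRegistry {
    interface CameraIF { String name(); }

    static final Map<String, Supplier<CameraIF>> FACTORIES = new HashMap<>();
    static {
        // One entry per supported camera class name.
        FACTORIES.put("jp.androidgroup.nyartoolkit.hardware.Cameraweb",
                () -> new CameraIF() {
                    public String name() { return "Cameraweb"; }
                });
    }

    static CameraIF create(String cameraName) {
        Supplier<CameraIF> f = FACTORIES.get(cameraName);
        if (f == null)
            throw new IllegalArgumentException("unknown camera: " + cameraName);
        return f.get();
    }

    public static void main(String[] args) {
        CameraIF cam = create("jp.androidgroup.nyartoolkit.hardware.Cameraweb");
        System.out.println(cam.name());
    }
}
```

The lookup also makes a misspelled camera_name fail loudly with the offending string, instead of silently falling through to the last else branch.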

I added a layout for Cameraweb, cameraweb.xml. I think something is wrong, because it is not working and I have a problem in this line:

mCameraDevice=new Cameraweb(this, mSurfaceView);

*******************************************************************************************
I need more help; I'm stuck. Thank you so much.




In reply to #73079