Writing code that captures videos on Android
Although the Google guys did a good job on the Android documentation, the explanation of how to write code that captures video is somewhat short.
In this tutorial, we are going to write an activity that can preview the camera and start and stop video capturing, and we will explain it in more detail than the basic documentation does.
We are going to do this for Android 2.1, but after that, we will discuss differences with 2.2.
Finally, we will illustrate how the undocumented, non-public setParameters method on MediaRecorder in 2.1 can be called through reflection.
This article is aimed at Android developers.
Setting the permissions
Since we are going to use the camera, the following line will definitely need to be declared in our AndroidManifest file:
<uses-permission android:name="android.permission.CAMERA" />
If we don't specify this, we will get a “Permission Denied” exception as soon as we try to access the camera from our code.
It is good practice to tell the app what features of the camera we are going to use too:
<uses-feature android:name="android.hardware.camera" />
<uses-feature android:name="android.hardware.camera.autofocus" />
If we don't specify these, however, the app will just assume that all camera features are used (camera, autofocus and flash). So to just make it work, we don't need to declare these.
We are also going to record audio during the video capture. So we also declare:
<uses-permission android:name="android.permission.RECORD_AUDIO" />
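Putting these together, the relevant part of the AndroidManifest.xml could look roughly like this (the package name is just a placeholder):

<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="com.example.videocapture"> <!-- placeholder package name -->

    <uses-permission android:name="android.permission.CAMERA" />
    <uses-permission android:name="android.permission.RECORD_AUDIO" />

    <uses-feature android:name="android.hardware.camera" />
    <uses-feature android:name="android.hardware.camera.autofocus" />

    <!-- the <application> element with the activity declaration goes here -->

</manifest>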
Setting up the camera preview
Before we discuss the actual video capturing, we are going to make sure that everything the camera sees is previewed on the screen.
SurfaceView is a special type of view that basically gives you a surface to draw to. It's used in various scenarios, such as drawing 2D or 3D objects, or playing videos.
In this case, we are going to draw the camera input to such a SurfaceView, so the user can preview what he is recording.
We define a camera_surface.xml layout file in which we setup the surfaceview:
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="fill_parent"
    android:layout_height="fill_parent">

    <SurfaceView
        android:id="@+id/surface_camera"
        android:layout_width="fill_parent"
        android:layout_height="fill_parent"
        android:layout_centerInParent="true"
        android:layout_weight="1" />

</RelativeLayout>
The following activity then uses the SurfaceView from the above layout XML and starts rendering the camera input to the screen:
import java.io.IOException;

import android.app.Activity;
import android.graphics.PixelFormat;
import android.hardware.Camera;
import android.os.Bundle;
import android.util.Log;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.widget.Toast;

public class CustomVideoCamera extends Activity implements SurfaceHolder.Callback {

    private static final String TAG = "CAMERA_TUTORIAL";

    private SurfaceView surfaceView;
    private SurfaceHolder surfaceHolder;
    private Camera camera;
    private boolean previewRunning;

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.camera_surface);
        surfaceView = (SurfaceView) findViewById(R.id.surface_camera);
        surfaceHolder = surfaceView.getHolder();
        surfaceHolder.addCallback(this);
        surfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
    }

    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        camera = Camera.open();
        if (camera != null) {
            Camera.Parameters params = camera.getParameters();
            camera.setParameters(params);
        } else {
            Toast.makeText(getApplicationContext(), "Camera not available!", Toast.LENGTH_LONG).show();
            finish();
        }
    }

    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
        if (previewRunning) {
            camera.stopPreview();
        }
        Camera.Parameters p = camera.getParameters();
        p.setPreviewSize(width, height);
        p.setPreviewFormat(PixelFormat.JPEG);
        camera.setParameters(p);
        try {
            camera.setPreviewDisplay(holder);
            camera.startPreview();
            previewRunning = true;
        } catch (IOException e) {
            Log.e(TAG, e.getMessage());
            e.printStackTrace();
        }
    }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
        camera.stopPreview();
        previewRunning = false;
        camera.release();
    }
}
The main thing we are doing here is implementing a SurfaceHolder.Callback. This callback enables us to intervene when our surface is created, changed (format or size changes), or destroyed. Without this callback, our screen would just remain black.
After the surface is created, we obviously want to display what the camera is seeing. First, we get a reference to the camera by calling the static method Camera.open(). We only need to do this once, so we put it in the surfaceCreated method.
The actual start of the preview happens in the surfaceChanged method. This is because this method is not only called right after surface creation (the first “change”), but also every time something essential to the surface changes, and at that point we want to stop the preview, change some parameters and restart it. For example, we use the passed width and height to set the preview size. By putting all of this in the surfaceChanged method, we make sure our preview always remains consistent with our surface.
When the surface is destroyed (this happens for example in onPause or onDestroy of the activity), we release the camera again, because otherwise other apps, like the native camera app, will start throwing “Camera already in use” exceptions.
On a final note,
surfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
means the surface is not going to own its buffers; this surface type is typically used for camera previews.
Capturing the video
We are now adding the following method to our activity, which will be called when the user decides to start recording:
// Additional fields on the activity (requires imports for java.io.File and android.media.MediaRecorder)
private MediaRecorder mediaRecorder;
private File tempFile;
private String cacheFileName = "video.tmp"; // any file name will do
private final int maxDurationInMs = 20000;
private final long maxFileSizeInBytes = 500000;
private final int videoFramesPerSecond = 20;

public boolean startRecording() {
    try {
        camera.unlock();

        mediaRecorder = new MediaRecorder();
        mediaRecorder.setCamera(camera);
        mediaRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
        mediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
        mediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.DEFAULT);
        mediaRecorder.setMaxDuration(maxDurationInMs);

        tempFile = new File(getCacheDir(), cacheFileName);
        mediaRecorder.setOutputFile(tempFile.getPath());

        mediaRecorder.setVideoFrameRate(videoFramesPerSecond);
        mediaRecorder.setVideoSize(surfaceView.getWidth(), surfaceView.getHeight());
        mediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.DEFAULT);
        mediaRecorder.setVideoEncoder(MediaRecorder.VideoEncoder.DEFAULT);
        mediaRecorder.setPreviewDisplay(surfaceHolder.getSurface());
        mediaRecorder.setMaxFileSize(maxFileSizeInBytes);

        mediaRecorder.prepare();
        mediaRecorder.start();
        return true;
    } catch (IllegalStateException e) {
        Log.e(TAG, e.getMessage());
        e.printStackTrace();
        return false;
    } catch (IOException e) {
        Log.e(TAG, e.getMessage());
        e.printStackTrace();
        return false;
    }
}
In this method we are preparing the MediaRecorder with all the necessary details.
First, we unlock the camera so we can pass it in a usable state to another process, in this case the recording process. This is the camera.unlock() call at the start of the try block.
Then we are setting all the properties of the MediaRecorder.
Two things are important here.
The first is the order in which the methods are called. For example, we need to set the sources before setting the encoders, and we have to set the encoders before calling prepare.
The second, important and less documented, is that ALL of these properties have to be set. prepare is a very sensitive and obscure method. The implementation is a native function that just returns an error code when something goes wrong. So, for example, if you forget to set maxDuration on the above mediaRecorder, you will get an obscure “prepare failed” error on most devices, which gives you no hint at all that you didn't set the maxDuration property. Many people assume that these properties are not required, and end up with these hard-to-debug errors.
After preparing the recorder, we start the actual recording.
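Since we set a maximum duration and a maximum file size, the recorder will stop on its own when either limit is reached. If you want to react to that, you can register an OnInfoListener before calling prepare(). This is not part of the code above, just a small sketch; it uses the stopRecording() method that is defined in the next section:

// Optional: get notified when maxDuration or maxFileSize is reached.
// Register this on the mediaRecorder before calling prepare().
mediaRecorder.setOnInfoListener(new MediaRecorder.OnInfoListener() {
    public void onInfo(MediaRecorder mr, int what, int extra) {
        if (what == MediaRecorder.MEDIA_RECORDER_INFO_MAX_DURATION_REACHED
                || what == MediaRecorder.MEDIA_RECORDER_INFO_MAX_FILESIZE_REACHED) {
            Log.i(TAG, "Recording limit reached, stopping recording");
            stopRecording();
        }
    }
});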
Stop recording
Then we stop recording in the following method:
public void stopRecording() {
    mediaRecorder.stop();
    camera.lock();
}
which speaks for itself.
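One thing the snippet above does not do is clean up the MediaRecorder itself. To free its resources once we are done, it is a good idea to also reset and release it; a slightly more defensive version (using the same fields as before) could look like this:

public void stopRecording() {
    if (mediaRecorder != null) {
        try {
            mediaRecorder.stop();
        } catch (IllegalStateException e) {
            // stop() throws if the recorder was never started or was already stopped
            Log.e(TAG, "Recorder was not recording", e);
        }
        mediaRecorder.reset();   // clear the configuration
        mediaRecorder.release(); // free the underlying resources
        mediaRecorder = null;
    }
    camera.lock(); // take the camera back from the recording process
}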
Android 2.1 vs 2.2
At the time of this writing, most Android devices are still running 2.1, and most developers are aiming for their apps to be compatible with 2.1 and above, which makes sense if one looks at the Android platform distribution numbers.
The reference documentation is already updated for 2.2 though.
If we take another look at the official instructions,
we notice that we have gone through all the steps mentioned there. We clarified some steps, like “passing a fully initialized SurfaceHolder”, and we also took care of the “see Media recorder information” part.
But we also did some things differently, because the official instructions describe 2.2 and some of the methods they use are not yet available in 2.1.
In general, the camera API has been changing and improving at lightning speed. The downside is that old APIs get deprecated very fast, and that you can't just use the latest API, since you would seriously hurt your potential number of customers on the market.
Portrait orientation
In 2.2, the setDisplayOrientation method is there, but it isn't in 2.1. In fact, portrait mode for capturing videos through the API is only supported since 2.2, as clearly stated in the New Developer APIs paragraph of the Android 2.2 highlights.
So, for our activity, it is necessary to specify in the manifest:
android:screenOrientation="landscape"
Otherwise, it is likely that the camera image will be rotated 90 degrees relative to what the user is seeing (this can be tweaked by setting the rotation parameter on the camera, but hacking the camera into working in portrait mode on 2.1 is outside the scope of this tutorial).
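In the manifest, that attribute goes on the activity declaration; assuming the activity is registered as CustomVideoCamera, it would look roughly like this:

<activity
    android:name=".CustomVideoCamera"
    android:screenOrientation="landscape" />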
Reconnect
Camera.reconnect() is another method that does not exist yet in 2.1, so we are obviously not calling it.
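For completeness: on 2.2 and higher, Camera.reconnect() lets the activity re-claim the camera after the MediaRecorder has finished with it. If you compile against the 2.2 SDK but keep a 2.1 minSdkVersion, a version-guarded call could look like this (a sketch, not part of the tutorial code):

// Camera.reconnect() only exists from API level 8 (Android 2.2) onwards.
if (Build.VERSION.SDK_INT >= 8) {
    try {
        camera.reconnect();
    } catch (IOException e) {
        Log.e(TAG, "Could not reconnect to the camera", e);
    }
} else {
    camera.lock(); // the 2.1 way of taking the camera back
}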
PixelFormat.JPEG
This constant, which we are using in our activity above, is already deprecated in 2.2. But since ImageFormat.JPEG, the suggested replacement, does not exist yet in 2.1, we are forced to use the deprecated API.
Calling the undocumented setParameters method on MediaRecorder
In 2.2, MediaRecorder has public setters for the video encoding bitrate, audio encoding bitrate, number of audio channels and audio sampling rate.
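For reference, this is roughly what those calls look like on 2.2 (the values are the same example values used in the reflection snippet further down; call these after setOutputFormat() and before prepare()):

// Android 2.2 (API level 8) and higher: public setters on MediaRecorder
mediaRecorder.setVideoEncodingBitRate(360000);
mediaRecorder.setAudioEncodingBitRate(23450);
mediaRecorder.setAudioChannels(1);
mediaRecorder.setAudioSamplingRate(8000);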
In 2.1, these properties can't be set through the official API.
If we take a look at the VideoCamera implementation in the Android source code, in the 2.1 tree, we find code like this:
mMediaRecorder.setParameters(String.format("video-param-encoding-bitrate=%d", mProfile.mVideoBitrate));
mMediaRecorder.setParameters(String.format("audio-param-encoding-bitrate=%d", mProfile.mAudioBitrate));
mMediaRecorder.setParameters(String.format("audio-param-number-of-channels=%d", mProfile.mAudioChannels));
mMediaRecorder.setParameters(String.format("audio-param-sampling-rate=%d", mProfile.mAudioSamplingRate));
Unfortunately, although it is present on all 2.1 devices as far as I know, the setParameters method is not part of the public API, so 2.1 developers are left in the cold there.
Luckily, there is a workaround.
When configuring the MediaRecorder (before calling prepare()), you can add the following lines:
// Requires imports for java.lang.reflect.Method and java.lang.reflect.InvocationTargetException
Method[] methods = mediaRecorder.getClass().getMethods();
for (Method method : methods) {
    if (method.getName().equals("setParameters")) {
        try {
            method.invoke(mediaRecorder, String.format("video-param-encoding-bitrate=%d", 360000));
            method.invoke(mediaRecorder, String.format("audio-param-encoding-bitrate=%d", 23450));
            method.invoke(mediaRecorder, String.format("audio-param-number-of-channels=%d", 1));
            method.invoke(mediaRecorder, String.format("audio-param-sampling-rate=%d", 8000));
        } catch (IllegalArgumentException e) {
            Log.e(TAG, e.getMessage());
            e.printStackTrace();
        } catch (IllegalAccessException e) {
            Log.e(TAG, e.getMessage());
            e.printStackTrace();
        } catch (InvocationTargetException e) {
            Log.e(TAG, e.getMessage());
            e.printStackTrace();
        }
    }
}
Through reflection, we iterate over the available methods on the MediaRecorder. If we find the setParameters method, we invoke it, for the same effect as in the camera app from the Android 2.1 source code.
Comments
Very helpful; thank you!
Thanks for the excellent walkthrough, much easier to follow than Google’s docs.
In my test app I put the start/stop on buttons on the app menu which was just a few lines of code in onOptionsItemSelected and onCreateOptionsMenu
Hi DeltaFlux,
Can you help me with an ‘unexpected application stopped’ failure? I pretty much did it how you described. My application just does not start, and I am not sure how to go about debugging if my app does not even start.
Did you do anything different? I noticed that the Camera calls (lock, unlock) are reversed compared to how they are described in the Google docs, but then I could be wrong.
None of my logcat calls will show up
thx
-Malay
Hi, you mention onOptionsItemSelected and onCreateOptionsMenu; this sounds familiar from another website. So I decided to integrate it into the code above, but it seems like there are lots of errors. Can you tell me how I should integrate the code on this page with onOptionsItemSelected and onCreateOptionsMenu to make my app work?
Hey, does anyone know why I always get “prepare failed”? On the emulator I can't run my app at all, and on a device it works but with bugs: the video has green or purple lines / blocky artifacts. I ran my app on two devices: on the first, the app records video with bugs; on the second, it crashed on the “start recording” button.
Please tell me how to fix it.
P.S. Regards, Peter.
What device are you testing on? I’ve come across similar problems on older devices. The solution was to develop for 1.5, but the API is even more limited then.
Great tutorial! Could you give some advice on how to capture without preview and how to record video in the background? I’m wondering about a use case where I enable the preview manually (during application setup, when I fix the device) and from then on start/stop recording using a widget (so I can interact with other apps). Thank you!
public class CustomVideoCamera extends Activity implements SurfaceHolder.Callback
Is there something else I forgot to do in the manifest file? I keep getting an error when I run it; if so, what steps should I perform to solve it? And by the way, the code under “Capturing the video”, where should I put it, is it in the same activity?
Is it possible to implement the above code to run in the background doing video recording, so that when an SMS or call comes in, the above activity won’t get destroyed?
…the uses-feature line for the autofocus has a typo. Sorry for the last comment.
Thanks, it’s perfect.
Phone crash problem: I have tried this example and written my own, and have the same problem. The problem seems to be related to the MediaRecorder. When I open the Video Recorder app and make multiple recordings, after 10 or more recordings the shutter button sound will stop. After this happens, if you press the volume key the phone will crash. Does anyone have an idea about this? In LogCat there will be an AudioFlinger “cannot create track” error.
I am using a Samsung Galaxy S with Android 2.2 running. If this issue is similar to one you have had please respond, I can give more info if needed.
Note that I have also written an app that only does sound and I have no problems. ALSO, if I do not actually start recording, but do run MediaRecorder.prepare() there are no problems.
Note: if you comment out all of the video-recording-specific stuff from the MediaRecorder, it records audio only just fine, without any of the problems I mentioned.
Thanks guys, this was very useful in overcoming my long headache. I use the following code to set the properties, and it is working fine:
Method[] methods = recorder.getClass().getMethods();

recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
recorder.setVideoSource(MediaRecorder.VideoSource.DEFAULT);
recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
recorder.setVideoFrameRate(24);
recorder.setVideoSize(720, 480);

for (Method method : methods) {
    try {
        if (method.getName().equals("setAudioChannels")) {
            method.invoke(recorder, 1); // setAudioChannels takes an int
        } else if (method.getName().equals("setAudioEncodingBitRate")) {
            method.invoke(recorder, 12200);
        } else if (method.getName().equals("setVideoEncodingBitRate")) {
            method.invoke(recorder, 3000000);
        } else if (method.getName().equals("setAudioSamplingRate")) {
            method.invoke(recorder, 8000);
        } else if (method.getName().equals("setVideoFrameRate")) {
            method.invoke(recorder, 24);
        }
    } catch (IllegalArgumentException e) {
        e.printStackTrace();
    } catch (IllegalAccessException e) {
        e.printStackTrace();
    } catch (InvocationTargetException e) {
        e.printStackTrace();
    }
}

recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
Thanks for the detailed procedure. Forward-porting to 2.3, 3.x, 4.x should be fairly straightforward, I’d imagine.
Here’s a question: can this application run in the background? I.e., can you capture video with the camera while using another app?
Thanks again!
Is it not supposed to work on a virtual device? I did not run it on a phone, but I’m getting force closes. Can anyone help? I basically copied the entire code given here… need HELP!!
Thanks for the useful tutorial. I have a question: once the video is captured, how can I transfer it directly to a PC via WiFi or Bluetooth? Thanks.
Any source code would be good.
Hello,
Is it possible to record a video from within a background service, without a preview?
I’m trying to do so, but it always leads to bugs. If I don’t set a preview for the camera and media recorder, I get “media server died”, “camera died”, “ICamera died” and error 100. If I try to use a dummy view which is not shown for mVideoCamera.setPreviewDisplay(fakeView.mHolder), this leads to an exception which says that null was passed as the surface! But I have created the fakeView. What can I do on Android 2.3.3? Please help.
Thanks in advance!
Hi, I have a little problem writing the code above. I defined camera of type Camera, but all of the methods used above (open, getParameters, …) are undefined for it :(
what should I do?
I have imported
import java.io.IOException;
import android.app.Activity;
import android.content.Intent;
import android.graphics.Camera;
import android.graphics.PixelFormat;
import android.media.MediaRecorder;
import android.os.Bundle;
import android.util.Log;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.widget.Toast;
Still, the first part throws a null pointer exception; I need your expertise to sort this out…