The latest release of the Android SDK introduced a bunch of cool new features for developers. Though not as glamorous as some APIs, the new audio manipulation classes, AudioTrack and AudioRecord, offer powerful functionality to developers looking to manipulate raw audio.
These classes let you record audio directly from the audio input hardware of the device, and stream PCM audio buffers to the audio hardware for playback. Strong sauce indeed for those of you looking to have more control over audio input and playback.
Enough talk - on to the code. To test out these new APIs I put together a simple Android app that listens to 10 seconds of input from the microphone and then plays it back through the speaker in reverse. Perfect for decoding secret backwards messages.
Start with the recording code. It's designed to record the incoming audio to a file on the SD card that we'll read and play back later. Don't forget that your application requires the RECORD_AUDIO uses-permission to record audio.
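For reference, the declaration goes in your AndroidManifest.xml; the exact manifest naturally depends on your own project, and on newer platform versions writing to the SD card needs its own permission as well:
<!-- Illustrative manifest excerpt: adjust for your own application. -->
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />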
The recording code here records 16-bit mono PCM audio at 11025Hz to reverseme.pcm on the SD card.
public void record() {
    int frequency = 11025;
    int channelConfiguration = AudioFormat.CHANNEL_CONFIGURATION_MONO;
    int audioEncoding = AudioFormat.ENCODING_PCM_16BIT;
    File file = new File(Environment.getExternalStorageDirectory().getAbsolutePath() + "/reverseme.pcm");

    // Delete any previous recording.
    if (file.exists())
        file.delete();

    // Create the new file.
    try {
        file.createNewFile();
    } catch (IOException e) {
        throw new IllegalStateException("Failed to create " + file.toString());
    }

    try {
        // Create a DataOutputStream to write the audio data into the saved file.
        OutputStream os = new FileOutputStream(file);
        BufferedOutputStream bos = new BufferedOutputStream(os);
        DataOutputStream dos = new DataOutputStream(bos);

        // Create a new AudioRecord object to record the audio.
        // Note that getMinBufferSize returns a size in bytes.
        int bufferSize = AudioRecord.getMinBufferSize(frequency, channelConfiguration, audioEncoding);
        AudioRecord audioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC,
                                                  frequency, channelConfiguration,
                                                  audioEncoding, bufferSize);

        short[] buffer = new short[bufferSize];
        isRecording = true;
        audioRecord.startRecording();

        while (isRecording) {
            int bufferReadResult = audioRecord.read(buffer, 0, bufferSize);
            for (int i = 0; i < bufferReadResult; i++)
                dos.writeShort(buffer[i]);
        }

        audioRecord.stop();
        audioRecord.release();
        dos.close();
    } catch (Throwable t) {
        Log.e("AudioRecord", "Recording Failed", t);
    }
}
Next we create a playback method that reads the file and plays back the contents in reverse. It's important to set the audio encoding (here 16-bit PCM), channel configuration, and frequency values to the same settings used in the AudioRecord object. Then again, experimenting with the playback frequency can produce some entertaining results.
public void play() {
    // Get the file we want to playback.
    File file = new File(Environment.getExternalStorageDirectory().getAbsolutePath() + "/reverseme.pcm");

    // Get the length of the audio stored in the file (16 bit so 2 bytes per short)
    // and create a short array to store the recorded audio.
    int musicLength = (int)(file.length()/2);
    short[] music = new short[musicLength];

    try {
        // Create a DataInputStream to read the audio data back from the saved file.
        InputStream is = new FileInputStream(file);
        BufferedInputStream bis = new BufferedInputStream(is);
        DataInputStream dis = new DataInputStream(bis);

        // Read the file into the music array in reverse order.
        int i = 0;
        while (dis.available() > 0) {
            music[musicLength-1-i] = dis.readShort();
            i++;
        }

        // Close the input streams.
        dis.close();

        // Create a new AudioTrack object using the same parameters as the AudioRecord
        // object used to create the file. Note that the buffer size is specified in
        // bytes, so the length in shorts is multiplied by two.
        AudioTrack audioTrack = new AudioTrack(AudioManager.STREAM_MUSIC,
                                               11025,
                                               AudioFormat.CHANNEL_CONFIGURATION_MONO,
                                               AudioFormat.ENCODING_PCM_16BIT,
                                               musicLength*2,
                                               AudioTrack.MODE_STREAM);

        // Start playback
        audioTrack.play();

        // Write the music buffer to the AudioTrack object
        audioTrack.write(music, 0, musicLength);
    } catch (Throwable t) {
        Log.e("AudioTrack", "Playback Failed", t);
    }
}
Finally, to drive this you need to update your application Activity to call the record and playback methods as appropriate. To keep this example as simple as possible I'm going to record for 10 seconds as soon as the application starts, and play the sample back in reverse as soon as I've finished recording it.
To be more useful you'd almost certainly want to perform the playback operation in a Service and on a background thread; a rough sketch of that threaded approach follows the Activity code below.
@Override
public void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.main);

    Thread thread = new Thread(new Runnable() {
        public void run() {
            record();
        }
    });
    thread.start();

    // Record for 10 seconds, then stop. Thread.sleep is used rather than
    // Object.wait, which would need to be called while holding the monitor.
    try {
        Thread.sleep(10000);
    } catch (InterruptedException e) {}
    isRecording = false;

    try {
        thread.join();
    } catch (InterruptedException e) {}

    play();
    finish();
}
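As suggested above, a more useful version would keep playback off the main thread as well. Here's a minimal sketch of that change, assuming the same record(), play(), and isRecording members from the example above; it's an illustration rather than production-ready code.
// Rough sketch: run the whole record-then-reverse-play sequence on a single
// background worker so the UI thread is never blocked by audio I/O.
Thread worker = new Thread(new Runnable() {
    public void run() {
        Thread recorder = new Thread(new Runnable() {
            public void run() {
                record();
            }
        });
        recorder.start();

        try {
            Thread.sleep(10000);   // Record for 10 seconds.
        } catch (InterruptedException e) {}
        isRecording = false;

        try {
            recorder.join();       // Wait for the file to finish writing.
        } catch (InterruptedException e) {}

        play();                    // Playback now happens off the main thread too.
    }
});
worker.start();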
The AudioTrack and AudioRecord classes offer a lot more functionality than I've demonstrated here. Using the AudioTrack streaming mode you can process incoming audio and play it back in near real time, letting you manipulate incoming or outgoing audio and perform signal processing on raw audio on the device.
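To give a flavour of what that looks like, here's a rough sketch of a pass-through loop that reads PCM from the microphone, applies a trivial gain change, and writes it straight back out through an AudioTrack. The sample rate, buffer sizing, and the halve-the-volume "effect" are my own illustrative choices rather than part of the example above.
// Rough sketch: near-real-time pass-through with a trivial gain "effect".
private void passThrough() {
    int frequency = 11025;
    int channelConfiguration = AudioFormat.CHANNEL_CONFIGURATION_MONO;
    int audioEncoding = AudioFormat.ENCODING_PCM_16BIT;

    int recordBufferSize = AudioRecord.getMinBufferSize(frequency, channelConfiguration, audioEncoding);
    int trackBufferSize = AudioTrack.getMinBufferSize(frequency, channelConfiguration, audioEncoding);

    AudioRecord audioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC,
                                              frequency, channelConfiguration,
                                              audioEncoding, recordBufferSize);
    AudioTrack audioTrack = new AudioTrack(AudioManager.STREAM_MUSIC,
                                           frequency, channelConfiguration,
                                           audioEncoding, trackBufferSize,
                                           AudioTrack.MODE_STREAM);

    short[] buffer = new short[recordBufferSize / 2];  // Buffer sizes are in bytes.
    audioRecord.startRecording();
    audioTrack.play();

    while (isRecording) {
        int samplesRead = audioRecord.read(buffer, 0, buffer.length);
        for (int i = 0; i < samplesRead; i++) {
            // Trivial "signal processing": halve the volume. Replace with real DSP.
            buffer[i] = (short) (buffer[i] / 2);
        }
        audioTrack.write(buffer, 0, samplesRead);
    }

    audioTrack.stop();
    audioRecord.stop();
    audioTrack.release();
    audioRecord.release();
}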
Tell us how you've used these APIs in your Android apps in the comments!
Posted by: Reto Meier, EMEA Android Developer Advocate, Google UK