I am trying to stream audio from the microphone of an Android device to a server over TCP. The problem is that I get an error in the log: the TCP connection is established, but no audio data is sent.
I understand this may be due to a poor choice of codec, because some output formats need to seek in the stream, which is impossible over a socket. I can really use any codec that works, but I read that MediaRecorder.OutputFormat.RAW_AMR combined with MediaRecorder.AudioEncoder.AMR_NB was the best choice for streaming. Please suggest an alternative if there is a better one.
Here is what I see in logcat:
11-06 11:09:27.276 22983-22983/se.jensolsson.test.test D/ViewRootImpl@5ed8717[MainActivity]: ViewPostImeInputStage processPointer 0
11-06 11:09:27.355 22983-22983/se.jensolsson.test.test D/ViewRootImpl@5ed8717[MainActivity]: ViewPostImeInputStage processPointer 1
11-06 11:09:27.387 22983-25466/se.jensolsson.test.test I/MediaRecorderJNI: setup
11-06 11:09:27.394 22983-25466/se.jensolsson.test.test I/MediaRecorderJNI: setAudioSource(1)
11-06 11:09:27.397 22983-25466/se.jensolsson.test.test I/MediaRecorderJNI: setAudioEncoder(1)
11-06 11:09:27.400 22983-25466/se.jensolsson.test.test I/MediaRecorderJNI: setOutputFile
11-06 11:09:27.400 22983-25466/se.jensolsson.test.test I/MediaRecorderJNI: prepare
11-06 11:09:27.407 22983-25466/se.jensolsson.test.test I/MediaRecorderJNI: start
11-06 11:09:27.408 22983-25466/se.jensolsson.test.test E/MediaRecorder: start failed: -38
11-06 11:09:27.408 22983-25466/se.jensolsson.test.test W/System.err: java.lang.IllegalStateException
11-06 11:09:27.411 22983-25466/se.jensolsson.test.test W/System.err:     at android.media.MediaRecorder._start(Native Method)
11-06 11:09:27.411 22983-25466/se.jensolsson.test.test W/System.err:     at android.media.MediaRecorder.start(MediaRecorder.java:1170)
11-06 11:09:27.411 22983-25466/se.jensolsson.test.test W/System.err:     at se.jensolsson.test.test.MainActivity$1$1.run(MainActivity.java:78)
11-06 11:09:27.411 22983-25466/se.jensolsson.test.test W/System.err:     at java.lang.Thread.run(Thread.java:762)
Here are the relevant parts of AndroidManifest.xml:
<uses-permission android:name="android.permission.CAMERA"/>
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE"/>
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />

<application
    android:allowBackup="true"
    android:icon="@mipmap/ic_launcher"
    android:label="@string/app_name"
    android:roundIcon="@mipmap/ic_launcher_round"
    android:supportsRtl="true"
    android:theme="@style/AppTheme">
    <activity android:name=".MainActivity">
        <intent-filter>
            <action android:name="android.intent.action.MAIN" />
            <category android:name="android.intent.category.LAUNCHER" />
        </intent-filter>
    </activity>
</application>
Here is the source code:
public class MainActivity extends AppCompatActivity {

    private MediaRecorder mediaRecorder;
    private boolean permissionToRecordAccepted;
    private static final int REQUEST_RECORD_AUDIO_PERMISSION = 200;
    private ParcelFileDescriptor pfd;

    @Override
    public void onRequestPermissionsResult(int requestCode, @NonNull String[] permissions, @NonNull int[] grantResults) {
        super.onRequestPermissionsResult(requestCode, permissions, grantResults);
        switch (requestCode) {
            case REQUEST_RECORD_AUDIO_PERMISSION:
                permissionToRecordAccepted = grantResults[0] == PackageManager.PERMISSION_GRANTED;
                if (!permissionToRecordAccepted) finish();
                break;
        }
    }

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        ActivityCompat.requestPermissions(this, new String[] { Manifest.permission.RECORD_AUDIO }, REQUEST_RECORD_AUDIO_PERMISSION);

        Button buttonStartRecording = (Button) findViewById(R.id.button_start_recording);
        buttonStartRecording.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View view) {
                new Thread(new Runnable() {
                    @Override
                    public void run() {
                        try {
                            Socket s = new Socket("10.0.83.8", 8888);
                            ParcelFileDescriptor pfd = ParcelFileDescriptor.fromSocket(s);

                            MediaRecorder recorder = new MediaRecorder();
                            recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
                            recorder.setOutputFormat(MediaRecorder.OutputFormat.RAW_AMR);
                            recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
                            recorder.setOutputFile(pfd.getFileDescriptor());

                            try {
                                recorder.prepare();
                            } catch (IllegalStateException e) {
                                e.printStackTrace();
                            } catch (IOException e) {
                                e.printStackTrace();
                            }

                            recorder.start();
                        } catch (Exception e) {
                            e.printStackTrace();
                        }
                    }
                }).start();
            }
        });
    }
}
The device I'm running on is a Samsung Galaxy A5 with Android 7.0. I am using minSdkVersion 22 and targetSdkVersion 26 in my Gradle file.
EDIT: The pre-installed voice recorder app works fine, so I don't see how the microphone could be busy.
EDIT 2: If I go one step further and save to a file instead of the stream, it works. So I suspect the problem really is the combination of audio format and streaming, since a network stream does not support seeking. If so, which format should I use?
//recorder.setOutputFile(pfd.getFileDescriptor());
File outputFile = File.createTempFile("test", "mp4", getApplicationContext().getCacheDir());
recorder.setOutputFile(outputFile.getPath());
EDIT 3: None of the answers so far are correct. I have now found that the core problem is that MediaRecorder cannot write the audio data to the stream created by ParcelFileDescriptor.fromSocket.
It works, however, if I do
ParcelFileDescriptor[] mParcelFileDescriptors = ParcelFileDescriptor.createPipe();
final ParcelFileDescriptor mParcelRead = new ParcelFileDescriptor(mParcelFileDescriptors[0]);
ParcelFileDescriptor mParcelWrite = new ParcelFileDescriptor(mParcelFileDescriptors[1]);
and then send the contents of the pipe to the server myself. I don't know whether this approach has any drawbacks, or whether it could corrupt certain audio file formats, since I believe the header bytes may need to be rewritten after recording finishes, depending on the format.
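For reference, forwarding the pipe's contents to the socket can be done with a plain byte-copy loop on a background thread. This is only a minimal sketch, not my exact code: it is written against generic java.io streams so the copy logic is visible (on Android, the input would be `new FileInputStream(mParcelRead.getFileDescriptor())` and the output `s.getOutputStream()`; `StreamPump` and `pump` are names I made up for this illustration):

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class StreamPump {
    /**
     * Copies bytes from in to out until EOF and returns the total
     * number of bytes forwarded. On Android this would run on its
     * own thread while MediaRecorder writes into the pipe's other end.
     */
    public static long pump(InputStream in, OutputStream out) throws IOException {
        byte[] buffer = new byte[4096];
        long total = 0;
        int read;
        while ((read = in.read(buffer)) != -1) {
            out.write(buffer, 0, read);   // forward this chunk to the socket
            total += read;
        }
        out.flush();
        return total;
    }
}
```

Note that this only copies bytes sequentially; it cannot seek back, so it would not help with formats whose headers are patched after recording ends.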