This article demonstrates how to implement automatic two-way call recording on Android 6.0. The content is concise and clearly organized, and should help clear up any questions you have about the topic, so let's work through it together.
The details are as follows:
The project required automatic recording of both sides of a call on Android 6.0. After reading articles on monitoring the Android phone call state and studying open-source recording projects on Git, I put together this write-up.
First, implement monitoring of the Android call state (incoming and outgoing calls).
Monitoring the phone's call state relies mainly on two classes:
TelephonyManager and PhoneStateListener
TelephonyManager provides access to information about the phone's basic telephony services, so an application can use it to query the state of those services. An application can also register a listener to be notified when the call state changes.
TelephonyManager cannot be instantiated directly; it can only be obtained as a system service:
Context.getSystemService(Context.TELEPHONY_SERVICE);
Note: reading certain phone information requires the corresponding permissions.
The main static constants (they correspond to the states reported through PhoneStateListener.LISTEN_CALL_STATE):
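On Android 6.0 itself, declaring these permissions in the manifest is not enough: READ_PHONE_STATE, PROCESS_OUTGOING_CALLS, RECORD_AUDIO and WRITE_EXTERNAL_STORAGE are "dangerous" permissions and must also be granted at runtime. A minimal sketch follows; the PermissionActivity class name and request code are illustrative and not part of the original project.

import android.Manifest;
import android.app.Activity;
import android.content.pm.PackageManager;
import android.os.Bundle;
import android.support.v4.app.ActivityCompat;
import android.support.v4.content.ContextCompat;

// Hypothetical launcher Activity: requests the dangerous permissions at runtime
// before the receiver/service can monitor calls and record audio on Android 6.0+.
public class PermissionActivity extends Activity {
    private static final int REQUEST_CODE_PERMISSIONS = 1; // arbitrary request code
    private static final String[] PERMISSIONS = {
            Manifest.permission.READ_PHONE_STATE,
            Manifest.permission.PROCESS_OUTGOING_CALLS,
            Manifest.permission.RECORD_AUDIO,
            Manifest.permission.WRITE_EXTERNAL_STORAGE
    };

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // Request all permissions at once if any of them is still missing
        for (String permission : PERMISSIONS) {
            if (ContextCompat.checkSelfPermission(this, permission) != PackageManager.PERMISSION_GRANTED) {
                ActivityCompat.requestPermissions(this, PERMISSIONS, REQUEST_CODE_PERMISSIONS);
                break;
            }
        }
    }
}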
int CALL_STATE_IDLE     // Idle: no call activity at all.
int CALL_STATE_OFFHOOK  // Off-hook: at least one call is active (dialing, connected, or on hold) and no call is ringing or waiting.
int CALL_STATE_RINGING  // Ringing: an incoming call is ringing, or a new call is waiting while another call is already in progress.
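If you only need to observe these states from an Activity or Service, you can also register a PhoneStateListener through TelephonyManager instead of using a broadcast receiver. A minimal sketch (the CallStateWatcher class is illustrative):

import android.content.Context;
import android.telephony.PhoneStateListener;
import android.telephony.TelephonyManager;
import android.util.Log;

// Minimal sketch: observe CALL_STATE_* changes via PhoneStateListener.
public class CallStateWatcher {
    public static void watch(Context context) {
        TelephonyManager tm = (TelephonyManager) context.getSystemService(Context.TELEPHONY_SERVICE);
        tm.listen(new PhoneStateListener() {
            @Override
            public void onCallStateChanged(int state, String incomingNumber) {
                switch (state) {
                    case TelephonyManager.CALL_STATE_IDLE:
                        Log.d("CallStateWatcher", "idle");
                        break;
                    case TelephonyManager.CALL_STATE_OFFHOOK:
                        Log.d("CallStateWatcher", "off-hook");
                        break;
                    case TelephonyManager.CALL_STATE_RINGING:
                        Log.d("CallStateWatcher", "ringing: " + incomingNumber);
                        break;
                }
            }
        }, PhoneStateListener.LISTEN_CALL_STATE);
    }
}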
The project uses a service to monitor the call state, so we need to know how these states map to the values carried in the broadcast:
EXTRA_STATE_IDLE            // In the phone-state-changed broadcast, represents CALL_STATE_IDLE (idle).
EXTRA_STATE_OFFHOOK         // In the phone-state-changed broadcast, represents CALL_STATE_OFFHOOK (off-hook).
EXTRA_STATE_RINGING         // In the phone-state-changed broadcast, represents CALL_STATE_RINGING (ringing).
ACTION_PHONE_STATE_CHANGED  // The action that identifies the phone-state-changed broadcast (intent).
                            // Note: requires the READ_PHONE_STATE permission.
String EXTRA_INCOMING_NUMBER  // Extras key for reading the incoming number from the broadcast.
String EXTRA_STATE            // Extras key for reading the call state from the broadcast.
So how do we monitor calls?
When the call state changes, Android sends a broadcast with the action android.intent.action.PHONE_STATE, and when an outgoing call is placed it sends a broadcast with the action
public static final String ACTION_NEW_OUTGOING_CALL = "android.intent.action.NEW_OUTGOING_CALL";
A custom broadcast receiver that handles these two broadcasts is all that is needed.
Here is the Java code (the Toasts are only there to make testing easier):
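The receiver can be declared in AndroidManifest.xml or registered dynamically; the article does not show the registration step, so the following is only a sketch of the dynamic variant (the ReceiverRegistrar class is illustrative, and PhoneCallReceiver is the class defined below):

import android.content.Context;
import android.content.Intent;
import android.content.IntentFilter;
import android.telephony.TelephonyManager;

// Minimal sketch: register PhoneCallReceiver at runtime for both broadcasts.
public class ReceiverRegistrar {
    public static PhoneCallReceiver register(Context context) {
        PhoneCallReceiver receiver = new PhoneCallReceiver();
        IntentFilter filter = new IntentFilter();
        filter.addAction(TelephonyManager.ACTION_PHONE_STATE_CHANGED); // android.intent.action.PHONE_STATE
        filter.addAction(Intent.ACTION_NEW_OUTGOING_CALL);             // android.intent.action.NEW_OUTGOING_CALL
        context.registerReceiver(receiver, filter);
        return receiver;
    }
}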
package com.example.hgx.phoneinfo60.Recording;

import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;
import android.telephony.TelephonyManager;
import android.widget.Toast;

/**
 * Created by hgx on 2016/6/13.
 */
public class PhoneCallReceiver extends BroadcastReceiver {
    private int lastCallState = TelephonyManager.CALL_STATE_IDLE;
    private boolean isIncoming = false;
    private static String contactNum;

    public PhoneCallReceiver() {
    }

    @Override
    public void onReceive(Context context, Intent intent) {
        if (intent.getAction().equals(Intent.ACTION_NEW_OUTGOING_CALL)) {
            // Outgoing call: remember the dialed number
            contactNum = intent.getExtras().getString(Intent.EXTRA_PHONE_NUMBER);
        } else {
            // android.intent.action.PHONE_STATE: there is no dedicated action for incoming calls,
            // so anything that is not an outgoing-call broadcast is handled as a state change here.
            String state = intent.getExtras().getString(TelephonyManager.EXTRA_STATE);
            String phoneNumber = intent.getExtras().getString(TelephonyManager.EXTRA_INCOMING_NUMBER);
            int stateChange = 0;
            if (state.equals(TelephonyManager.EXTRA_STATE_IDLE)) {
                // Idle: the call has ended
                stateChange = TelephonyManager.CALL_STATE_IDLE;
                if (isIncoming) {
                    onIncomingCallEnded(context, phoneNumber);
                } else {
                    onOutgoingCallEnded(context, phoneNumber);
                }
            } else if (state.equals(TelephonyManager.EXTRA_STATE_OFFHOOK)) {
                // Off-hook: a call has just become active
                stateChange = TelephonyManager.CALL_STATE_OFFHOOK;
                if (lastCallState != TelephonyManager.CALL_STATE_RINGING) {
                    // The previous state was not RINGING, so this call is outgoing
                    isIncoming = false;
                    onOutgoingCallStarted(context, phoneNumber);
                } else {
                    // Otherwise this is an incoming call that has just been answered
                    isIncoming = true;
                    onIncomingCallAnswered(context, phoneNumber);
                }
            } else if (state.equals(TelephonyManager.EXTRA_STATE_RINGING)) {
                // Ringing: an incoming call is arriving
                stateChange = TelephonyManager.CALL_STATE_RINGING;
                onIncomingCallReceived(context, phoneNumber);
            }
            // Remember the last state so OFFHOOK can tell incoming from outgoing on the next call
            lastCallState = stateChange;
        }
    }

    protected void onIncomingCallStarted(Context context, String number) {
        Toast.makeText(context, "Incoming call is started", Toast.LENGTH_LONG).show();
        context.startService(new Intent(context, AudioRecorderService.class));
    }

    protected void onOutgoingCallStarted(Context context, String number) {
        Toast.makeText(context, "Outgoing call is started", Toast.LENGTH_LONG).show();
        context.startService(new Intent(context, AudioRecorderService.class));
    }

    protected void onIncomingCallEnded(Context context, String number) {
        Toast.makeText(context, "Incoming call is ended", Toast.LENGTH_LONG).show();
        // Stop the recording service so its onDestroy() finalizes the recording
        context.stopService(new Intent(context, AudioRecorderService.class));
    }

    protected void onOutgoingCallEnded(Context context, String number) {
        Toast.makeText(context, "Outgoing call is ended", Toast.LENGTH_LONG).show();
        // Stop the recording service so its onDestroy() finalizes the recording
        context.stopService(new Intent(context, AudioRecorderService.class));
    }

    protected void onIncomingCallReceived(Context context, String number) {
        Toast.makeText(context, "Incoming call is received", Toast.LENGTH_LONG).show();
    }

    protected void onIncomingCallAnswered(Context context, String number) {
        Toast.makeText(context, "Incoming call is answered", Toast.LENGTH_LONG).show();
        context.startService(new Intent(context, AudioRecorderService.class));
    }
}
Here is the Java implementation of AudioRecorderService:
package com.example.hgx.phoneinfo60.Recording;

import android.app.Service;
import android.content.Intent;
import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;
import android.os.AsyncTask;
import android.os.Environment;
import android.os.IBinder;
import android.util.Log;
import android.widget.Toast;

import com.example.hgx.phoneinfo60.MyApplication;

import java.io.DataOutputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.IOException;
import java.net.HttpURLConnection;
import java.net.URL;

/**
 * Created by hgx on 2016/6/13.
 */
public class AudioRecorderService extends Service {
    private static int RECORD_RATE = 0;
    private static int RECORD_BPP = 16; // bits per sample; the WAV header below is written for 16-bit PCM
    private static int RECORD_CHANNEL = AudioFormat.CHANNEL_IN_MONO;
    private static int RECORD_ENCODER = AudioFormat.ENCODING_PCM_16BIT;
    private AudioRecord audioRecorder = null;
    private Thread recordT = null;
    private boolean isRecording = false;
    private int bufferEle = 1024, bytesPerEle = 2; // 2048 bytes per read: 1024 elements of 2 bytes in 16-bit format
    private static int[] recordRate = {44100, 22050, 11025, 8000};
    int bufferSize = 0;
    File uploadFile;

    @Override
    public IBinder onBind(Intent intent) {
        // Binding is not used; the service is only started and stopped by the receiver
        return null;
    }

    @Override
    public void onDestroy() {
        if (isRecording) {
            stopRecord();
        } else {
            Toast.makeText(MyApplication.getContext(), "Recording is already stopped", Toast.LENGTH_SHORT).show();
        }
        super.onDestroy();
    }

    @Override
    public int onStartCommand(Intent intent, int flags, int startId) {
        if (!isRecording) {
            startRecord();
        } else {
            Toast.makeText(MyApplication.getContext(), "Recording is already started", Toast.LENGTH_SHORT).show();
        }
        return START_STICKY;
    }

    private void startRecord() {
        audioRecorder = initializeRecord();
        if (audioRecorder != null) {
            Toast.makeText(MyApplication.getContext(), "Recording is started", Toast.LENGTH_SHORT).show();
            audioRecorder.startRecording();
        } else {
            return;
        }
        isRecording = true;
        recordT = new Thread(new Runnable() {
            @Override
            public void run() {
                writeToFile();
            }
        }, "Recording Thread");
        recordT.start();
    }

    private void writeToFile() {
        byte[] bData = new byte[bufferEle];
        FileOutputStream fos;
        File recordFile = createTempFile();
        try {
            fos = new FileOutputStream(recordFile);
        } catch (FileNotFoundException e) {
            e.printStackTrace();
            return;
        }
        // Keep reading from the AudioRecord and appending to the temp file until recording stops
        while (isRecording) {
            int read = audioRecorder.read(bData, 0, bufferEle);
            if (read > 0) {
                try {
                    fos.write(bData, 0, read);
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }
        }
        try {
            fos.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    // Converts short sample data to little-endian byte data
    private byte[] writeShortToByte(short[] sData) {
        int size = sData.length;
        byte[] byteArrayData = new byte[size * 2];
        for (int i = 0; i < size; i++) {
            byteArrayData[i * 2] = (byte) (sData[i] & 0x00FF);
            byteArrayData[(i * 2) + 1] = (byte) (sData[i] >> 8);
            sData[i] = 0;
        }
        return byteArrayData;
    }

    // Creates the temporary .raw file used while recording
    private File createTempFile() {
        return new File(Environment.getExternalStorageDirectory(), "aditi.raw");
    }

    // Creates the destination .wav file
    private File createWavFile() {
        return new File(Environment.getExternalStorageDirectory(), "aditi_" + System.currentTimeMillis() + ".wav");
    }

    /*
     * Convert the raw recording to a wav file
     *
     * @param tempFile temporary raw file
     * @param wavFile  destination wav file
     */
    private void convertRawToWavFile(File tempFile, File wavFile) {
        long audioLength;
        long dataLength;
        long sampleRate = RECORD_RATE;
        int channel = 1;
        long byteRate = RECORD_BPP * RECORD_RATE * channel / 8; // bytes per second
        byte[] data = new byte[bufferSize];
        try {
            FileInputStream fin = new FileInputStream(tempFile);
            FileOutputStream fos = new FileOutputStream(wavFile);
            audioLength = fin.getChannel().size();
            dataLength = audioLength + 36;
            createWaveFileHeader(fos, audioLength, dataLength, sampleRate, channel, byteRate);
            int read;
            while ((read = fin.read(data)) != -1) {
                fos.write(data, 0, read);
            }
            fin.close();
            fos.close();
            uploadFile = wavFile.getAbsoluteFile();
        } catch (FileNotFoundException e) {
            //Log.e("AudioRecorderService:convertRawToWavFile", e.getMessage());
        } catch (IOException e) {
            //Log.e("AudioRecorderService:convertRawToWavFile", e.getMessage());
        } catch (Exception e) {
            //Log.e("AudioRecorderService:convertRawToWavFile", e.getMessage());
        }
    }

    /*
     * Write the 44-byte RIFF/WAVE header for the wav file
     */
    private void createWaveFileHeader(FileOutputStream fos, long audioLength, long dataLength,
                                      long sampleRate, int channel, long byteRate) {
        byte[] header = new byte[44];
        header[0] = 'R'; // RIFF/WAVE header
        header[1] = 'I';
        header[2] = 'F';
        header[3] = 'F';
        header[4] = (byte) (dataLength & 0xff);
        header[5] = (byte) ((dataLength >> 8) & 0xff);
        header[6] = (byte) ((dataLength >> 16) & 0xff);
        header[7] = (byte) ((dataLength >> 24) & 0xff);
        header[8] = 'W';
        header[9] = 'A';
        header[10] = 'V';
        header[11] = 'E';
        header[12] = 'f'; // 'fmt ' chunk
        header[13] = 'm';
        header[14] = 't';
        header[15] = ' ';
        header[16] = 16; // 4 bytes: size of the 'fmt ' chunk
        header[17] = 0;
        header[18] = 0;
        header[19] = 0;
        header[20] = 1; // format = 1 (PCM)
        header[21] = 0;
        header[22] = (byte) channel;
        header[23] = 0;
        header[24] = (byte) (sampleRate & 0xff);
        header[25] = (byte) ((sampleRate >> 8) & 0xff);
        header[26] = (byte) ((sampleRate >> 16) & 0xff);
        header[27] = (byte) ((sampleRate >> 24) & 0xff);
        header[28] = (byte) (byteRate & 0xff);
        header[29] = (byte) ((byteRate >> 8) & 0xff);
        header[30] = (byte) ((byteRate >> 16) & 0xff);
        header[31] = (byte) ((byteRate >> 24) & 0xff);
        header[32] = (byte) (channel * 16 / 8); // block align = channels * bits per sample / 8
        header[33] = 0;
        header[34] = 16; // bits per sample
        header[35] = 0;
        header[36] = 'd';
        header[37] = 'a';
        header[38] = 't';
        header[39] = 'a';
        header[40] = (byte) (audioLength & 0xff);
        header[41] = (byte) ((audioLength >> 8) & 0xff);
        header[42] = (byte) ((audioLength >> 16) & 0xff);
        header[43] = (byte) ((audioLength >> 24) & 0xff);
        try {
            fos.write(header, 0, 44);
        } catch (IOException e) {
            //Log.e("AudioRecorderService:createWaveFileHeader()", e.getMessage());
        }
    }

    // Delete the temporary raw file
    private void deleteTempFile() {
        File file = createTempFile();
        file.delete();
    }

    /*
     * Initialize the AudioRecord: try each sample rate / format / channel
     * combination until one is accepted by the device
     */
    private AudioRecord initializeRecord() {
        short[] audioFormat = new short[]{AudioFormat.ENCODING_PCM_16BIT, AudioFormat.ENCODING_PCM_8BIT};
        short[] channelConfiguration = new short[]{AudioFormat.CHANNEL_IN_MONO, AudioFormat.CHANNEL_IN_STEREO};
        for (int rate : recordRate) {
            for (short aFormat : audioFormat) {
                for (short cConf : channelConfiguration) {
                    try {
                        int buffSize = AudioRecord.getMinBufferSize(rate, cConf, aFormat);
                        bufferSize = buffSize;
                        if (buffSize != AudioRecord.ERROR_BAD_VALUE) {
                            AudioRecord aRecorder = new AudioRecord(MediaRecorder.AudioSource.DEFAULT,
                                    rate, cConf, aFormat, buffSize);
                            if (aRecorder.getState() == AudioRecord.STATE_INITIALIZED) {
                                RECORD_RATE = rate;
                                return aRecorder;
                            }
                        }
                    } catch (Exception e) {
                        //Log.e("AudioRecorderService:initializeRecord()", e.getMessage());
                    }
                }
            }
        }
        return null;
    }

    /*
     * Stop and release the AudioRecord, convert the raw data to wav,
     * upload the result and delete the temporary file
     */
    private void stopRecord() {
        if (null != audioRecorder) {
            isRecording = false;
            audioRecorder.stop();
            audioRecorder.release();
            audioRecorder = null;
            recordT = null;
            Toast.makeText(getApplicationContext(), "Recording is stopped", Toast.LENGTH_LONG).show();
        }
        convertRawToWavFile(createTempFile(), createWavFile());
        if (uploadFile.exists()) {
            //Log.d("AudioRecorderService:stopRecord()", "UploadFile exists");
        }
        new UploadFile().execute(uploadFile);
        deleteTempFile();
    }
}
That covers how to implement automatic two-way call recording on Android 6.0. Thanks for reading! Hopefully you now have a good understanding of the approach and the shared code is useful in your own projects.