android - Why isn't onMarkerReached called on the next playback unless I leave enough time after stopping?
Problem description
(Sorry for revising this over and over.)
I want to start playing a new stream immediately after stopping an AudioTrack.
However, if the next playback starts only a few milliseconds after stopping with stop(), the onMarkerReached callback may never be invoked.
I wrote some sample code.
Pressing the button plays a sound.
When the button is pressed, 0.5 s of data is written to the AudioTrack, and 0.25 s later onMarkerReached writes the next 0.5 s of the waveform to the AudioTrack. The played sound is 2 s long.
If the button is pressed again while the 2 s sound is playing, the code calls stop() and flush() and starts the next playback. At that point onMarkerReached may not be called unless extra delay is inserted before playing.
I verified this on two Android devices.
On device A, onMarkerReached is called even with 0 s of extra delay. On device B, however, onMarkerReached is not called even with 10 ms of extra delay.
Why is onMarkerReached sometimes not called?
Here is the sample code.
MainActivity.java
package com.example.audiotrackexample;

import androidx.appcompat.app.AppCompatActivity;

import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioTrack;
import android.os.Bundle;
import android.os.Handler;
import android.os.Looper;
import android.util.Log;
import android.view.View;
import android.widget.Button;
import android.widget.TextView;

public class MainActivity extends AppCompatActivity implements AudioTrack.OnPlaybackPositionUpdateListener {
    final String TAG = MainActivity.class.getSimpleName();

    Button playBtn;
    Button queryBtn;
    TextView curPosTv;
    TextView curPosTv2;

    public class SinGenerator {
        float freq;
        int samplingRate;
        int sampleCount;

        public SinGenerator(float freq, int samplingRate) {
            this.freq = freq;
            this.samplingRate = samplingRate;
            this.sampleCount = 0;
        }

        public short generate() {
            this.sampleCount++;
            double t = (double) freq * sampleCount / samplingRate;
            double sin = Math.sin(2.0 * Math.PI * t);
            return (short) (sin * Short.MAX_VALUE);
        }
    }

    static final int SAMPLING_RATE = 16000;
    static final int AUDIO_DATA_FORMAT = AudioFormat.ENCODING_PCM_16BIT;
    static final int CHANNEL = AudioFormat.CHANNEL_OUT_MONO;
    static final int READ_WAVE_BUFFER_SIZE = SAMPLING_RATE / 2; // 0.5 s
    static final int AUDIO_TRACK_MIN_BUFFER_SIZE = SAMPLING_RATE;

    int audioTrackBufferSize;
    AudioTrack audioTrack;
    SinGenerator sinGenerator = new SinGenerator(300, SAMPLING_RATE);
    short[] wave;
    short[] readWaveBuff = new short[READ_WAVE_BUFFER_SIZE];
    int waveReadLen = 0;
    Handler handler = new Handler(Looper.myLooper());

    void setNextWave() {
        System.arraycopy(wave, waveReadLen, readWaveBuff, 0, READ_WAVE_BUFFER_SIZE);
        waveReadLen += READ_WAVE_BUFFER_SIZE;
    }

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        playBtn = findViewById(R.id.playBtn);
        queryBtn = findViewById(R.id.queryBtn);
        curPosTv = findViewById(R.id.curPosTv);
        curPosTv2 = findViewById(R.id.curPosTv2);

        queryBtn.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View v) {
                Log.d(TAG, "curPlaybackPos : " + audioTrack.getPlaybackHeadPosition());
                curPosTv2.setText("curPlaybackPos : " + audioTrack.getPlaybackHeadPosition());
            }
        });

        audioTrackBufferSize = AudioTrack.getMinBufferSize(
                SAMPLING_RATE,
                CHANNEL,
                AUDIO_DATA_FORMAT);
        if (audioTrackBufferSize < AUDIO_TRACK_MIN_BUFFER_SIZE) {
            audioTrackBufferSize = AUDIO_TRACK_MIN_BUFFER_SIZE;
        }
        Log.d(TAG, "audioTrackBufferSize : " + audioTrackBufferSize);

        audioTrack = new AudioTrack(
                AudioManager.STREAM_MUSIC,
                SAMPLING_RATE,
                CHANNEL,
                AUDIO_DATA_FORMAT,
                audioTrackBufferSize,
                AudioTrack.MODE_STREAM);
        audioTrack.setPlaybackPositionUpdateListener(MainActivity.this);

        wave = new short[READ_WAVE_BUFFER_SIZE * 4];
        for (int i = 0; i < 4; i++) {
            sinGenerator = new SinGenerator(300 + i * 100, SAMPLING_RATE);
            for (int j = 0; j < READ_WAVE_BUFFER_SIZE; j++) {
                wave[i * READ_WAVE_BUFFER_SIZE + j] = sinGenerator.generate();
            }
        }

        playBtn.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View v) {
                audioTrack.stop();
                audioTrack.flush();
                handler.postDelayed(new Runnable() {
                    @Override
                    public void run() {
                        waveReadLen = 0;
                        setNextWave();
                        audioTrack.setNotificationMarkerPosition(READ_WAVE_BUFFER_SIZE / 2);
                        audioTrack.write(readWaveBuff, 0, READ_WAVE_BUFFER_SIZE);
                        audioTrack.play();
                        curPosTv.setText("curPlaybackPos : " + audioTrack.getPlaybackHeadPosition());
                    }
                }, 0);   // 0 ms:   onMarkerReached is called on device A, but not on device B.
                // }, 10);  // 10 ms:  onMarkerReached is called on device A, but not on device B.
                // }, 100); // 100 ms: onMarkerReached is called on both devices A and B.
            }
        });
    }

    @Override
    public void onMarkerReached(AudioTrack track) {
        curPosTv.setText("curPlaybackPos : " + audioTrack.getPlaybackHeadPosition());
        if (waveReadLen == wave.length) {
            audioTrack.stop();
            audioTrack.flush();
            Log.d(TAG, "finish playing");
        } else {
            setNextWave();
            audioTrack.write(readWaveBuff, 0, READ_WAVE_BUFFER_SIZE);
            int newMarkerPosition = audioTrack.getNotificationMarkerPosition();
            if (waveReadLen == wave.length) {
                newMarkerPosition += (READ_WAVE_BUFFER_SIZE / 2) + READ_WAVE_BUFFER_SIZE;
            } else {
                newMarkerPosition += READ_WAVE_BUFFER_SIZE;
            }
            audioTrack.setNotificationMarkerPosition(newMarkerPosition);
            Log.d(TAG, "curPos : " + audioTrack.getPlaybackHeadPosition()
                    + " markerPos : " + audioTrack.getNotificationMarkerPosition());
        }
    }

    @Override
    public void onPeriodicNotification(AudioTrack track) {
    }
}
activity_main.xml
<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:app="http://schemas.android.com/apk/res-auto"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    tools:context=".MainActivity"
    android:gravity="center"
    android:orientation="vertical">

    <Button
        android:id="@+id/playBtn"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:text="Audio Play" />

    <TextView
        android:id="@+id/curPosTv"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content" />

    <Button
        android:id="@+id/queryBtn"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:text="Query marker position" />

    <TextView
        android:id="@+id/curPosTv2"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content" />

</LinearLayout>
build.gradle
apply plugin: 'com.android.application'

android {
    compileSdkVersion 29
    buildToolsVersion "29.0.3"

    defaultConfig {
        applicationId "com.example.audiotrackexample"
        minSdkVersion 21
        targetSdkVersion 29
        versionCode 1
        versionName "1.0"
        testInstrumentationRunner "androidx.test.runner.AndroidJUnitRunner"
    }

    buildTypes {
        release {
            minifyEnabled false
            proguardFiles getDefaultProguardFile('proguard-android-optimize.txt'), 'proguard-rules.pro'
        }
    }
}

dependencies {
    implementation fileTree(dir: 'libs', include: ['*.jar'])
    implementation 'androidx.appcompat:appcompat:1.1.0'
    implementation 'androidx.constraintlayout:constraintlayout:1.1.3'
    testImplementation 'junit:junit:4.12'
    androidTestImplementation 'androidx.test.ext:junit:1.1.1'
    androidTestImplementation 'androidx.test.espresso:espresso-core:3.2.0'
}
Solution
onMarkerReached is a poorly specified API; its behavior appears to vary from OEM to OEM, and from what I have seen the update interval can be as coarse as a full second (as can the value returned by AudioTrack.getNotificationMarkerPosition()). IMHO, the only acceptable use case for onMarkerReached() is a periodic seek-bar update in a media-player-style control.

I recommend instead streaming the new content into a single pre-existing AudioTrack. In other words, start exactly one AudioTrack, let it keep playing, and make sure it never starves (you can insert silence into the track to fill any gaps). That way, you decide how "playback 1" transitions into "playback 2", and you control the timing precisely.
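The single-long-lived-track idea above can be sketched roughly as follows. This is a minimal sketch, not the original poster's code: the names PcmSink and AudioFeeder are mine, and the AudioTrack call is hidden behind a tiny interface so the feed logic is plain Java. On Android the sink would simply wrap audioTrack.write(samples, 0, samples.length) on a track that is play()ed once and never stopped.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

// Abstraction over the output; on Android this would delegate to
// AudioTrack.write() on a MODE_STREAM track that keeps playing.
interface PcmSink {
    void write(short[] samples);
}

class AudioFeeder {
    static final int CHUNK = 8000; // 0.5 s at 16 kHz mono, as in the question

    private final PcmSink sink;
    private final BlockingQueue<short[]> pending = new ArrayBlockingQueue<>(8);
    private final short[] silence = new short[CHUNK]; // zeros == PCM16 silence

    AudioFeeder(PcmSink sink) {
        this.sink = sink;
    }

    // Called by the UI thread to start the next sound: no stop()/flush(),
    // just hand the data to the feeder.
    void queue(short[] samples) {
        pending.offer(samples);
    }

    // Called repeatedly from a feeder thread. Writes queued audio if any,
    // otherwise a chunk of silence, so the track never starves.
    void feedOnce() {
        short[] chunk = pending.poll(); // non-blocking
        sink.write(chunk != null ? chunk : silence);
    }
}
```

Because the track never enters a stopped state, there is no stop()-to-play() transition for the marker callback to get lost in; a new sound starts playing as soon as its first chunk reaches the queue.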