Unable to load a TFLite model following the official TFLite example

Problem description

I am trying to create a smart reply application using TFLite, and I am following the prebuilt example from GitHub.

When I clone the referenced project from git and build it, it works perfectly.

However, when I copy the referenced project's code (along with the Gradle dependencies, assets, libraries, and everything else) into my own project, it fails to load the tflite model and throws a runtime error:

E/AndroidRuntime: FATAL EXCEPTION: main
Process: com.legendsayantan.replai, PID: 14279
java.lang.UnsatisfiedLinkError: No implementation found for long com.legendsayantan.replai.SmartReplyClient.loadJNI(java.nio.MappedByteBuffer, java.lang.String[]) (tried Java_com_legendsayantan_replai_SmartReplyClient_loadJNI and Java_com_legendsayantan_replai_SmartReplyClient_loadJNI__Ljava_nio_MappedByteBuffer_2_3Ljava_lang_String_2)
    at com.legendsayantan.replai.SmartReplyClient.loadJNI(Native Method)
    at com.legendsayantan.replai.SmartReplyClient.loadModel(SmartReplyClient.java:64)
    at com.legendsayantan.replai.MainActivity.lambda$onStart$0(MainActivity.java:90)
    at com.legendsayantan.replai.-$$Lambda$MainActivity$Xdq7R5vPx_buuatNOneWHck6N2o.run(Unknown Source:0)
    at android.os.Handler.handleCallback(Handler.java:888)
    at android.os.Handler.dispatchMessage(Handler.java:100)
    at android.os.Looper.loop(Looper.java:213)
    at android.app.ActivityThread.main(ActivityThread.java:8178)
    at java.lang.reflect.Method.invoke(Native Method)
    at com.android.internal.os.RuntimeInit$MethodAndArgsCaller.run(RuntimeInit.java:513)
    at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:1101)

Here is my MainActivity.java:

import android.content.Context;
import android.content.SharedPreferences;
import android.os.Bundle;
import android.os.Handler;
import android.util.Log;

import android.view.Menu;


import com.google.android.material.navigation.NavigationView;

import androidx.navigation.NavController;
import androidx.navigation.Navigation;
import androidx.navigation.ui.AppBarConfiguration;
import androidx.navigation.ui.NavigationUI;
import androidx.drawerlayout.widget.DrawerLayout;
import androidx.appcompat.app.AppCompatActivity;
import androidx.appcompat.widget.Toolbar;

import org.tensorflow.lite.Interpreter;

public class MainActivity extends AppCompatActivity {

private AppBarConfiguration mAppBarConfiguration;
public static SharedPreferences sharedPreferences;
public static Context context;

public static final String TAG = "SmartReply";
public static SmartReplyClient client;
public static Handler handler;
public static Interpreter model;

@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    context=getApplicationContext();
    setContentView(R.layout.activity_main);

    client = new SmartReplyClient(getApplicationContext());
    handler = new Handler();

    sharedPreferences = getPreferences(Context.MODE_PRIVATE);
    Toolbar toolbar = findViewById(R.id.toolbar);
    setSupportActionBar(toolbar);
    DrawerLayout drawer = findViewById(R.id.drawer_layout);
    NavigationView navigationView = findViewById(R.id.nav_view);
    // Passing each menu ID as a set of Ids because each
    // menu should be considered as top level destinations.
    mAppBarConfiguration = new AppBarConfiguration.Builder(
            R.id.nav_home, R.id.nav_gallery, R.id.nav_slideshow)
            .setDrawerLayout(drawer)
            .build();
    NavController navController = Navigation.findNavController(this, R.id.nav_host_fragment);
    NavigationUI.setupActionBarWithNavController(this, navController, mAppBarConfiguration);
    NavigationUI.setupWithNavController(navigationView, navController);

}

@Override
public boolean onCreateOptionsMenu(Menu menu) {
    // Inflate the menu; this adds items to the action bar if it is present.
    getMenuInflater().inflate(R.menu.main, menu);
    return true;
}

@Override
public boolean onSupportNavigateUp() {
    NavController navController = Navigation.findNavController(this, R.id.nav_host_fragment);
    return NavigationUI.navigateUp(navController, mAppBarConfiguration)
            || super.onSupportNavigateUp();
}

@Override
protected void onStart() {
    super.onStart();
    Log.v(TAG, "onStart");
    handler.post(
            () -> {
               client.loadModel();
            });
}

@Override
protected void onStop() {
    super.onStop();
    Log.v(TAG, "onStop");
    handler.post(
            () -> {
                client.unloadModel();
            });
}

private static void send(final String message) {
    handler.post(
            () -> {
                StringBuilder textToShow = new StringBuilder();
                textToShow.append("Input: ").append(message).append("\n\n");

                // Get suggested replies from the model.
                SmartReply[] ans = client.predict(new String[] {message});
                for (SmartReply reply : ans) {
                    textToShow.append("Reply: ").append(reply.getText()).append("\n");
                }
                textToShow.append("------").append("\n");


            });
      }
}

Here is SmartReplyClient.java (exactly the same file as in the reference GitHub project):

import android.content.Context;
import android.content.res.AssetFileDescriptor;

import androidx.annotation.Keep;
import androidx.annotation.WorkerThread;

import java.io.BufferedReader;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStreamReader;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;
import java.util.ArrayList;
import java.util.List;


public class SmartReplyClient implements AutoCloseable {
  private static final String TAG = "SmartReplyDemo";
  private static final String MODEL_PATH = "smartreply.tflite";
  private static final String BACKOFF_PATH = "backoff_response.txt";
  private static final String JNI_LIB = "smartreply_jni";

  private final Context context;
  private long storage;
  private MappedByteBuffer model;

  private volatile boolean isLibraryLoaded;

  public SmartReplyClient(Context context) {
    this.context = context;
  }

  public boolean isLoaded() {
    return storage != 0;
  }

  @WorkerThread
  public synchronized void loadModel() {
    if (!isLibraryLoaded) {
      System.loadLibrary(JNI_LIB);
      isLibraryLoaded = true;
    }

    try {
      model = loadModelFile();
      String[] backoff = loadBackoffList();
      storage = loadJNI(model, backoff); //This line is throwing the error
      // But this same java file works nice in the reference project
    } catch (Exception e) {
      System.out.println(e.getMessage());
      return;
    }
  }

  @WorkerThread
  public synchronized SmartReply[] predict(String[] input) {
    if (storage != 0) {
      return predictJNI(storage, input);
    } else {
      return new SmartReply[] {};
    }
  }

  @WorkerThread
  public synchronized void unloadModel() {
     close();
  }

  @Override
  public synchronized void close() {
    if (storage != 0) {
      unloadJNI(storage);
      storage = 0;
    }
  }

  public MappedByteBuffer loadModelFile() throws IOException {
     try (AssetFileDescriptor fileDescriptor =
         AssetsUtil.getAssetFileDescriptorOrCached(context, MODEL_PATH);
         FileInputStream inputStream = new FileInputStream(fileDescriptor.getFileDescriptor())) {
         FileChannel fileChannel = inputStream.getChannel();
         long startOffset = fileDescriptor.getStartOffset();
         long declaredLength = fileDescriptor.getDeclaredLength();
         return fileChannel.map(FileChannel.MapMode.READ_ONLY, startOffset, declaredLength);
     }
  }

   private String[] loadBackoffList() throws IOException {
    List<String> labelList = new ArrayList<String>();
    try (BufferedReader reader =
        new BufferedReader(new InputStreamReader(context.getAssets().open(BACKOFF_PATH)))) {
      String line;
      while ((line = reader.readLine()) != null) {
        if (!line.isEmpty()) {
          labelList.add(line);
        }
      }
    }
    String[] ans = new String[labelList.size()];
    labelList.toArray(ans);
    return ans;
  }

  @Keep
  private native long loadJNI(MappedByteBuffer buffer, String[] backoff);

  @Keep
  private native SmartReply[] predictJNI(long storage, String[] text);

  @Keep
  private native void unloadJNI(long storage);
}

In build.gradle I have also added the same TensorFlow Lite version as the reference example:

implementation 'org.tensorflow:tensorflow-lite:0.0.0-nightly-SNAPSHOT'

With this setup, it throws the error mentioned above.

To avoid that error, I also tried loading the TFLite model with an Interpreter in the activity's onCreate:

Interpreter interpreter;
try {
    interpreter = new Interpreter(loadmodelfile());
} catch (IOException e) {
    e.printStackTrace();
}

The loadmodelfile() method:

public MappedByteBuffer loadmodelfile() throws IOException {
    AssetFileDescriptor assetFileDescriptor = this.getAssets().openFd("smartreply.tflite");
    FileInputStream fileInputStream = new FileInputStream(assetFileDescriptor.getFileDescriptor());
    FileChannel fileChannel = fileInputStream.getChannel();
    long startoff = assetFileDescriptor.getStartOffset();
    long length = assetFileDescriptor.getDeclaredLength();
    return fileChannel.map(FileChannel.MapMode.READ_ONLY, startoff, length);
}

Even so, I got this error:

E/AndroidRuntime: FATAL EXCEPTION: main
    Process: com.legendsayantan.tflitesmartreplyremake, PID: 10879
    java.lang.RuntimeException: Unable to start activity ComponentInfo{com.legendsayantan.tflitesmartreplyremake/com.legendsayantan.tflitesmartreplyremake.MainActivity}: java.lang.IllegalStateException: Internal error: Unexpected failure when preparing tensor allocations: Encountered unresolved custom op: Normalize.
    Node number 0 (Normalize) failed to prepare.
    
        at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:3782)
        at android.app.ActivityThread.handleLaunchActivity(ActivityThread.java:3961)
        at android.app.servertransaction.LaunchActivityItem.execute(LaunchActivityItem.java:91)
        at android.app.servertransaction.TransactionExecutor.executeCallbacks(TransactionExecutor.java:149)
        at android.app.servertransaction.TransactionExecutor.execute(TransactionExecutor.java:103)
        at android.app.ActivityThread$H.handleMessage(ActivityThread.java:2386)
        at android.os.Handler.dispatchMessage(Handler.java:107)
        at android.os.Looper.loop(Looper.java:213)
        at android.app.ActivityThread.main(ActivityThread.java:8178)
        at java.lang.reflect.Method.invoke(Native Method)
        at com.android.internal.os.RuntimeInit$MethodAndArgsCaller.run(RuntimeInit.java:513)
        at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:1101)
     Caused by: java.lang.IllegalStateException: Internal error: Unexpected failure when preparing tensor allocations: Encountered unresolved custom op: Normalize.
    Node number 0 (Normalize) failed to prepare.
    
        

Please let me know what I am missing or what mistakes I have made in any of these attempts.

I am new to TensorFlow, but not to building Android apps; I just can't figure out what I am getting wrong here.

Any help or suggestions would be greatly appreciated!

Tags: java, android, tensorflow, tensorflow-lite

Solution


Encountered unresolved custom op: Normalize.

It looks like your model has a custom op, Normalize, which means you need to implement your own TFLite custom op and register it with the TFLite interpreter. If you have no plans to implement the custom op, consider using the Select TF ops option instead, which can leverage TensorFlow ops on mobile:

https://www.tensorflow.org/lite/guide/ops_select
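If you go the Select TF ops route, the change is typically just an extra Gradle dependency alongside the TensorFlow Lite runtime you already have. A minimal sketch, assuming you keep the same nightly version used above (adjust to whatever version you actually depend on):

dependencies {
    // Existing TFLite runtime (same line as above)
    implementation 'org.tensorflow:tensorflow-lite:0.0.0-nightly-SNAPSHOT'
    // Select TF ops (Flex) support, so ops without a built-in TFLite kernel
    // can fall back to the TensorFlow kernels on device
    implementation 'org.tensorflow:tensorflow-lite-select-tf-ops:0.0.0-nightly-SNAPSHOT'
}

Note that this noticeably increases the APK size, and it only helps if Normalize can be resolved as a TensorFlow op; if it is a truly app-specific custom op, it has to be implemented and registered natively, as described in the guide linked above.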
