Using TensorFlow Lite 2.0 advanced GPU on Android with C++

Problem description

I'm new to TensorFlow. I built the TensorFlow Lite library from source and I'm trying to use it for face recognition as part of a project. I need the input and output to stay in GPU memory, i.e. the input is an OpenGL texture and the output should be an OpenGL texture as well. Unfortunately, the documentation at https://www.tensorflow.org/lite/performance/gpu_advanced is out of date on this. I tried to use gpu::gl::InferenceBuilder to build a gpu::gl::InferenceRunner, but I'm stuck: I don't understand how to obtain the model in GraphFloat32 (Model) form, or where the TfLiteContext is supposed to come from.

My experimental code sample:

#include <memory>

#include <android/log.h>

#include "tensorflow/lite/delegates/gpu/delegate.h"   // TfLiteGpuDelegateV2Create / options
#include "tensorflow/lite/delegates/gpu/gl/api2.h"    // gl::InferenceEnvironment, NewInferenceEnvironment (recent source trees)
#include "tensorflow/lite/interpreter.h"

using namespace tflite::gpu;
using namespace tflite::gpu::gl;

// Start from the library defaults, then override what we need.
TfLiteGpuDelegateOptionsV2 options = TfLiteGpuDelegateV2OptionsDefault();
options.inference_preference = TFLITE_GPU_INFERENCE_PREFERENCE_SUSTAINED_SPEED;
options.is_precision_loss_allowed = 1;  // allow FP16

TfLiteDelegate* tfGPUDelegate = TfLiteGpuDelegateV2Create(&options);
if (interpreter->ModifyGraphWithDelegate(tfGPUDelegate) != kTfLiteOk) {
    __android_log_print(ANDROID_LOG_ERROR, "Tensorflow", "GPU Delegate hasn't been created");
    return;
} else {
    __android_log_print(ANDROID_LOG_INFO, "Tensorflow", "GPU Delegate has been created");
}

InferenceEnvironmentOptions envOption;
InferenceEnvironmentProperties properties;
std::unique_ptr<InferenceEnvironment> env;
auto envStatus = NewInferenceEnvironment(envOption, &env, &properties);

if (envStatus.ok()) {
    __android_log_print(ANDROID_LOG_INFO, "Tensorflow", "Inference environment has been created");
} else {
    __android_log_print(ANDROID_LOG_ERROR, "Tensorflow", "Inference environment hasn't been created");
    __android_log_print(ANDROID_LOG_ERROR, "Tensorflow", "Message: %s", envStatus.error_message().c_str());
}

InferenceOptions builderOptions;
builderOptions.usage = InferenceUsage::SUSTAINED_SPEED;
builderOptions.priority1 = InferencePriority::MIN_LATENCY;
builderOptions.priority2 = InferencePriority::AUTO;
builderOptions.priority3 = InferencePriority::AUTO;

//The last part requires a model
//   GraphFloat32* graph;
//   TfLiteContext* tfLiteContex;
//
//   auto buildStatus = BuildModel(tfLiteContex, delegate_params, &graph);
//   if (buildStatus.ok()){}
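
For context on the commented-out block: BuildModel(context, delegate_params, &graph) from delegates/gpu/common/model_builder.h is designed to be called from inside a delegate kernel's init callback, because that is where the interpreter supplies both the TfLiteContext and the TfLiteDelegateParams; the stock GPU delegate obtains its GraphFloat32 exactly this way. The sketch below only illustrates that pattern; MakeGpuKernelRegistration is an illustrative name, and header paths and the Status type differ between TFLite versions.

#include <memory>

#include "tensorflow/lite/c/common.h"                            // TfLiteContext, TfLiteRegistration, TfLiteDelegateParams
#include "tensorflow/lite/delegates/gpu/common/model.h"          // GraphFloat32
#include "tensorflow/lite/delegates/gpu/common/model_builder.h"  // BuildModel

// Sketch only: a delegate-kernel registration whose init() receives the
// TfLiteContext and TfLiteDelegateParams from the interpreter and builds the
// GraphFloat32 there, as the commented-out snippet above intends.
TfLiteRegistration MakeGpuKernelRegistration() {
  TfLiteRegistration registration{};
  registration.init = [](TfLiteContext* context, const char* buffer,
                         size_t) -> void* {
    // For kernels registered via ReplaceNodeSubsetsWithDelegateKernels, the
    // 'buffer' argument is the serialized TfLiteDelegateParams describing the
    // partition handed to the delegate.
    const auto* delegate_params =
        reinterpret_cast<const TfLiteDelegateParams*>(buffer);
    auto graph = std::make_unique<tflite::gpu::GraphFloat32>();
    if (!tflite::gpu::BuildModel(context, delegate_params, graph.get()).ok()) {
      return nullptr;
    }
    return graph.release();
  };
  registration.free = [](TfLiteContext*, void* data) {
    delete reinterpret_cast<tflite::gpu::GraphFloat32*>(data);
  };
  return registration;
}

Such a registration is passed to context->ReplaceNodeSubsetsWithDelegateKernels() from a delegate's Prepare callback; outside of a delegate there is no TfLiteContext to hand to BuildModel.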

Tags: android, c++, tensorflow2.0

Solution
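
If the goal is to drive the gl InferenceEnvironment/InferenceBuilder directly (so that inputs and outputs stay OpenGL textures), one direction worth sketching is to build the GraphFloat32 straight from the FlatBuffer model, bypassing TfLiteContext altogether. Newer TFLite source trees expose tflite::gpu::BuildFromFlatBuffer() in delegates/gpu/common/model_builder.h for this; treat the following as a sketch under that assumption (the helper may be absent from a 2.0 checkout, and LoadGraph is an illustrative name):

#include "tensorflow/lite/delegates/gpu/common/model.h"          // GraphFloat32
#include "tensorflow/lite/delegates/gpu/common/model_builder.h"  // BuildFromFlatBuffer (recent trees)
#include "tensorflow/lite/kernels/register.h"                    // BuiltinOpResolver
#include "tensorflow/lite/model.h"                               // FlatBufferModel

// Sketch: load a .tflite file and convert it into the GPU delegate's internal
// graph representation, which the InferenceBuilder consumes.
bool LoadGraph(const char* model_path, tflite::gpu::GraphFloat32* graph) {
  auto flatbuffer = tflite::FlatBufferModel::BuildFromFile(model_path);
  if (!flatbuffer) return false;
  tflite::ops::builtin::BuiltinOpResolver op_resolver;
  return tflite::gpu::BuildFromFlatBuffer(*flatbuffer, op_resolver, graph).ok();
}

The resulting graph is what InferenceEnvironment::NewInferenceBuilder expects; its exact signature (and whether it lives in gl/api.h or gl/api2.h) varies between versions, so check the headers in your checkout. OpenGL texture inputs and outputs are then described on the builder/runner objects rather than through the TfLiteGpuDelegateV2 path shown in the question.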


