Optimizing the runtime of a per-entry search over a large array during file processing

Problem description

I built a file upload and processing feature in Laravel, but the runtime is very long.

The files look like this (they are very large, roughly 50k hands per file):

QhQs3s2s@86,QdQs3s2s@86,QcQs3s2s@86,KhKs3s2s@100,KdKs3s2s@100,KcKs3s2s@100,AhAs3s2s@86,AdAs3s2s@86,AcAs3s2s@86

The file is uploaded as a txt file and then split into groups of 1000 "hands":

/**
 * Upload the range file, split it into chunks, and dispatch one job per chunk.
 */
public function uploadFile(Request $request)
{
    // process SituationName
    $name = $request->input('name');
    $situation = Situation::firstOrCreate(['name' => $name, 'active' => 1]);

    //process RaiseRange
    $action = Action::where('name', 'Raise')->first();
    $path = $request->file('rangeRaise')->store('ranges');

    //Split Files
    $content = Storage::disk('local')->get($path);
    $array = explode(",", $content);
    $arrayFinal = array_chunk($array, 1000);

    foreach($arrayFinal as $arrayJob){
        $filename = 'ranges/RaiseFile'.uniqid().'.txt';
        Storage::disk('local')->put($filename, json_encode($arrayJob));
        ProcessRangeFiles::dispatch($action, $situation, $filename);
    }
}

Each chunk is then dispatched as a job whose handle method looks like this:

public function handle()
{
    Log::info('File Processing started');
    $array = null;
    $content = null;
    $found = null;

    $path = $this->path;
    $action = $this->action;
    $situation = $this->situation;

    $hands = Hand::all();

    $content = json_decode(Storage::disk('local')->get($path));

    foreach ($content as $key=>$line){
        $array[$key] = explode('@', $line);
        foreach($hands as $hand){
            if($hand->hand == $array[$key][0]){
                $found = $hand;
                break;
            }
        }
        DB::table('hands_to_situations_to_actions')->insert([
            'hand_id'      => $found->id,
            'action_id'    => $action->id,
            'situation_id' => $situation->id,
            'percentage'   => $array[$key][1],
            'created_at'   => Carbon::now()->toDateTimeString(),
            'updated_at'   => Carbon::now()->toDateTimeString(),
        ]);
    }
    Log::info('File Processing finished');
}

$hands is filled with every possible Omaha poker hand.

Does anyone know how to optimize this code? The runtime is currently about 12 minutes per 1000-hand chunk.

Tags: php, laravel, optimization

Solution
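
The bottleneck is twofold: for every one of the 1000 lines, the inner loop scans the entire in-memory Hand collection (every possible Omaha hand) until it finds a match, and each line then triggers its own single-row INSERT. Indexing the hands by their hand string once turns each lookup into an O(1) hash access, and collecting the rows into bulk inserts removes most of the query overhead. Below is a minimal sketch of a reworked handle method applying those two changes; it assumes the hand column values are unique and reuses the table and column names from the question.

public function handle()
{
    Log::info('File Processing started');

    // Build the lookup table once: hand string => Hand model.
    // Collection::keyBy gives O(1) lookups instead of a linear scan per line.
    $hands = Hand::all()->keyBy('hand');

    $content = json_decode(Storage::disk('local')->get($this->path));
    $now = Carbon::now()->toDateTimeString();
    $rows = [];

    foreach ($content as $line) {
        [$handString, $percentage] = explode('@', $line);

        $hand = $hands->get($handString);
        if ($hand === null) {
            // Skip unknown hands instead of silently reusing the previous match.
            Log::warning("Unknown hand: {$handString}");
            continue;
        }

        $rows[] = [
            'hand_id'      => $hand->id,
            'action_id'    => $this->action->id,
            'situation_id' => $this->situation->id,
            'percentage'   => $percentage,
            'created_at'   => $now,
            'updated_at'   => $now,
        ];
    }

    // One bulk insert per slice instead of 1000 single-row inserts;
    // slicing keeps each query below typical placeholder/packet limits.
    foreach (array_chunk($rows, 500) as $slice) {
        DB::table('hands_to_situations_to_actions')->insert($slice);
    }

    Log::info('File Processing finished');
}

If memory matters more than model access, Hand::pluck('id', 'hand') builds the same map with only the two needed columns, so the lookup becomes $hands[$handString] ?? null without hydrating full Eloquent models.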

