Freeing the MySQL or Python cache

Problem description

I am updating a MySQL database with BLOBs. Beyond 2000 BLOBs it becomes very slow (2000 BLOBs = 44 seconds, 3000 BLOBs = 5 minutes 10 seconds). I split the BLOBs into chunks of 2000, but it did not help. I also made the script sleep after 2000 BLOBs, but that did not help either. I suspect a caching problem (in Python or in MySQL?). Thanks for any help.

import os
import mysql.connector
from datetime import datetime
import time

start_time_update = datetime.now()

connection = mysql.connector.connect(user='root', password='12345', database='sakila')

try:
    cursor = connection.cursor()

    # Fetch one existing picture BLOB from the sakila sample database to use as test data
    sql = "select picture from staff where staff_id=1"
    cursor.execute(sql)
    result = cursor.fetchone()[0]

    # Write 3000 copies of the picture to disk to use as input files
    os.makedirs('Sortie', exist_ok=True)
    for i in range(3000):
        with open('Sortie/mon_fichier_blob_' + str(i) + '.jpg', 'wb') as f:
            f.write(result)

    sql=("CREATE TABLE IF NOT EXISTS test_2 (ID INT NOT NULL AUTO_INCREMENT, `ID_author` INT NULL,`ID_publisher` INT NULL, `Image` BLOB NULL, `nom` VARCHAR(200) NULL,PRIMARY KEY (ID));")
    cursor.execute(sql)
    connection.commit()

    # Build the 3000 rows with an empty Image column; the BLOBs are added later with UPDATE
    fileName = []
    for i in range(3000):
        id = 0                      # 0 lets AUTO_INCREMENT assign the primary key
        id_aut = i**2
        id_pub = i
        img = None
        nom = 'Sortie/mon_fichier_blob_' + str(i) + '.jpg'
        tp = (id, id_aut, id_pub, img, nom)
        fileName.append(tp)

    # Insert the rows one by one, then commit once at the end
    for ind in fileName:
        sql = """INSERT INTO test_2 VALUES (%s,%s,%s,%s,%s)"""
        cursor.execute(sql, ind)
    connection.commit()

    # Read the pictures back from disk, keyed by file name
    album = dict()
    for i in range(3000):
        with open('Sortie/mon_fichier_blob_' + str(i) + '.jpg', 'rb') as f:
            album['Sortie/mon_fichier_blob_' + str(i) + '.jpg'] = f.read()

    # Reshape into (BLOB, file name) tuples, the parameter order the UPDATE expects
    list_blob = []
    for key, value in album.items():
        adq = (value, key)
        list_blob.append(adq)

    # Update every row with its BLOB: one execute() and one commit() per row
    for blob in list_blob:
        sql = """UPDATE test_2 SET Image = %s WHERE nom = %s"""
        cursor.execute(sql, blob)
        connection.commit()

finally:
    connection.close()

time_elapsed_update = datetime.now() - start_time_update 

print('Time elapsed Update Method (hh:mm:ss.ms) {}'.format(time_elapsed_update)) 
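
For reference: the timings above grow faster than linearly, which fits two properties of the UPDATE loop rather than a Python or MySQL cache filling up: the nom column used in the WHERE clause is not indexed (so every UPDATE scans the table), and a commit is issued after every single row. Below is a minimal sketch of a batched variant, assuming the same server, credentials, test_2 table and Sortie/ files as the code above; the index name idx_test_2_nom is made up, and the chunk size of 2000 mirrors the one mentioned in the question.

import mysql.connector

# Sketch only: assumes the same local server, credentials, test_2 table and
# Sortie/mon_fichier_blob_<i>.jpg files produced by the script above.
list_blob = []
for i in range(3000):
    nom = 'Sortie/mon_fichier_blob_' + str(i) + '.jpg'
    with open(nom, 'rb') as f:
        list_blob.append((f.read(), nom))      # (BLOB, file name) pairs

connection = mysql.connector.connect(user='root', password='12345', database='sakila')
try:
    cursor = connection.cursor()

    # Index the lookup column so each UPDATE no longer scans the whole table.
    # idx_test_2_nom is a made-up name; skip this if the index already exists.
    cursor.execute("CREATE INDEX idx_test_2_nom ON test_2 (nom)")

    # Send the updates in chunks of 2000 and commit once per chunk,
    # instead of one commit per row.
    sql = "UPDATE test_2 SET Image = %s WHERE nom = %s"
    batch_size = 2000
    for start in range(0, len(list_blob), batch_size):
        cursor.executemany(sql, list_blob[start:start + batch_size])
        connection.commit()
finally:
    connection.close()

Note that with mysql.connector, executemany() only rewrites multi-row INSERT ... VALUES statements; for an UPDATE it still executes one statement per row, so whatever gain this sketch gives would come mainly from the index and from committing per chunk rather than per row.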

Tags: python, mysql, caching, blob

Solution
