Why does my code give different results every time?

Problem description

Until a few days ago, my code was reproducible on every run, and now it isn't! I don't know what happened; I only changed a few lines of code, and I can't figure out how to fix it!

# Make the code reproducible
import numpy as np
import os
import random as rn
import tensorflow as tf
import keras
from keras import backend as K

#-----------------------------Keras reproducible------------------#
SEED = 1234

tf.set_random_seed(SEED)
os.environ['PYTHONHASHSEED'] = str(SEED)
np.random.seed(SEED)
rn.seed(SEED)

session_conf = tf.ConfigProto(
    intra_op_parallelism_threads=1, 
    inter_op_parallelism_threads=1
)
sess = tf.Session(
    graph=tf.get_default_graph(), 
    config=session_conf
)
K.set_session(sess)

# Import the datasets (training and test)
import pandas as pd
poker_train = pd.read_csv("C:/Users/elihe/Documents/Studium Master/WS 19 und 20/Softwareprojekt/poker-hand-training-true.data")
poker_test = pd.read_csv("C:/Users/elihe/Documents/Studium Master/WS 19 und 20/Softwareprojekt/poker-hand-testing.data")

X_tr = poker_train.iloc[:, 0:10]
y_train = poker_train.iloc[:, 10:11]
X_te = poker_test.iloc[:, 0:10]
y_test = poker_test.iloc[:, 10:11]

from sklearn.preprocessing import StandardScaler
sc = StandardScaler()
X_train = sc.fit_transform(X_tr)
X_test = sc.transform(X_te)

# Build the NN with Keras
import keras
from keras.models import Sequential
from keras.layers import Dense

nen = Sequential()
nen.add(Dense(100, input_dim = 10, activation = 'relu'))
nen.add(Dense(50, activation = 'relu'))
nen.add(Dense(10, activation = 'softmax'))

# Compile and train
nen.compile(loss='sparse_categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
nen_fit = nen.fit(X_train, y_train, epochs=2, batch_size=50, verbose=1, validation_split=0.2, shuffle=False)
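Since the labels stay as integer class IDs here, `sparse_categorical_crossentropy` is the matching loss; with one-hot-encoded labels the loss would have to be `categorical_crossentropy` instead. A minimal NumPy sketch of the difference between the two label formats (`to_one_hot` is just an illustrative helper, not a Keras function):

```python
import numpy as np

def to_one_hot(labels, num_classes):
    # Each integer label selects one row of the identity matrix.
    eye = np.eye(num_classes, dtype=np.float32)
    return eye[labels]

y_sparse = np.array([0, 3, 9])        # integer class IDs -> sparse_categorical_crossentropy
y_onehot = to_one_hot(y_sparse, 10)   # one-hot rows     -> categorical_crossentropy

assert y_onehot.shape == (3, 10)
assert (y_onehot.argmax(axis=1) == y_sparse).all()
```

Mixing the two (one-hot targets with the sparse loss, or vice versa) raises a shape error rather than silently producing different numbers, so the label-format change itself is unlikely to cause non-determinism.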

I am only using 2 epochs so that I can immediately see whether my output is the same; normally there would be 500 epochs. The code at the top made everything reproducible until today, but now it doesn't! I changed the part with X_te and X_tr, because earlier I one-hot encoded the y_train and y_test classes and now I don't. I also changed the activation function from sigmoid to relu and the optimizer from RMSprop to adam. I don't know what to do :(
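The seeding idea at the top of the listing can be sanity-checked in isolation with plain Python (`seeded_draws` is just an illustrative helper): a generator reproduces the exact same sequence only when it is re-seeded before every run, so if any randomness source (Python's hash seed, `random`, NumPy, TensorFlow, or GPU kernels) is left unseeded, or is seeded after it has already been used, runs will diverge.

```python
import random

def seeded_draws(seed, n=5):
    # Use a dedicated generator so global state elsewhere cannot interfere.
    rng = random.Random(seed)
    return [rng.random() for _ in range(n)]

# Same seed -> identical sequence on every call.
assert seeded_draws(1234) == seeded_draws(1234)
# Different seed -> a different sequence.
assert seeded_draws(1234) != seeded_draws(4321)
```

Note in particular that `PYTHONHASHSEED` only takes effect if it is set before the Python interpreter starts, so setting it inside the script (as above) has no effect on the current process.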

Tags: python, tensorflow, neural-network

