SimpleRNN with multi-dimensional input gives poor results

Problem description

I am building a super simple RNN model.

It takes time-series data with an obvious pattern:

([[1, 2, 3], [3, 4, 5], [5, 6, 7], [7, 8, 9], [9, 10, 11], [11, 12, 13], [13, 14, 15], [15, 16, 17], [17, 18, 19], [19, 20, 21], [21, 22, 23], [23, 24, 25], [25, 26, 27], [27, 28, 29], [29, 30, 31]])

It should use the previous 10 arrays to predict the next one, but the results are terrible:

[[ 1.          2.          3.        ]
 [ 3.          4.          5.        ]
 [ 5.          6.          7.        ]
 [ 7.          8.          9.        ]
 [ 9.         10.         11.        ]
 [11.         12.         13.        ]
 [13.         14.         15.        ]
 [15.         16.         17.        ]
 [17.         18.         19.        ]
 [19.         20.         21.        ]
 [16.58571815 14.85821152 14.95420837] # predict from here below
 [16.53819847 13.39703369 13.26765823]
 [16.53710938 13.11023235 13.01197338]
 [16.53708267 13.06925201 12.98667526]
 [16.53708267 13.06376362 12.98433018]
 [16.53708267 13.06303596 12.98411465]
 [16.53708267 13.06293964 12.98409462]
 [16.53708267 13.06292725 12.98409176]
 [16.53708267 13.06292534 12.98409271]
 [16.53708267 13.06292534 12.98409271]]

Why are the results so bad???

My assumption was that it should be easy to solve; it is a very simple number quiz.

Is there a good way to improve this model, or is there a problem with my code?

import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, SimpleRNN


ori_data = np.array([[1, 2, 3], [3, 4, 5], [5, 6, 7], [7, 8, 9], [9, 10, 11],
                     [11, 12, 13], [13, 14, 15], [15, 16, 17], [17, 18, 19], [19, 20, 21],
                     [21, 22, 23], [23, 24, 25], [25, 26, 27], [27, 28, 29], [29, 30, 31]])

x = np.array(ori_data[:-1])  # inputs: every row except the last
y = np.array(ori_data[1:])   # targets: the row after each input row

print(x.shape) # (14, 3)
print(y.shape) # (14, 3)

x_train = np.array(x).reshape(14, 3, 1)  # (samples, timesteps, features): 3 timesteps of 1 feature each
y_train = np.array(y).reshape(14, 3, 1)

print(x_train.shape)
print(y_train.shape)

NUM_DIM = 20  # RNN units
NUM_RNN = 10  # declared timesteps (the reshaped data actually has 3)
epoch = 100   # unused; fit() below uses epochs=50
model = Sequential()
model.add(SimpleRNN(NUM_DIM, input_shape=(NUM_RNN, 1), return_sequences=True))
model.add(Dense(1, activation="linear"))  
model.compile(loss="mean_squared_error", optimizer="sgd")
model.summary()


history = model.fit(x_train, y_train, epochs=50, batch_size=12)


# seed the rollout with the first 10 rows
x_test = ori_data[0:NUM_RNN,]


NUM_DATA = 10
for i in range(NUM_DATA):

    y_pred = model.predict(x_test[-NUM_RNN:].reshape(NUM_RNN, 3, 1))
    res = y_pred[NUM_RNN-1][:, 0].reshape(1, 3)  # take the outputs for the last input row as the predicted next row

    x_test = np.concatenate((x_test, res))

print(x_test)
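One likely culprit, for reference: reshape(14, 3, 1) tells Keras that each sample is a sequence of 3 timesteps with 1 feature, while input_shape=(NUM_RNN, 1) declares 10 timesteps, which is why training emits shape warnings (see the log further down). If the intent is "use the previous 10 rows to predict the next row", the windows themselves should form the timestep axis and the 3 columns should be the features. A minimal sketch of that framing, with illustrative names and parameter values:

import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, SimpleRNN

ori_data = np.array([[2*i + 1, 2*i + 2, 2*i + 3] for i in range(15)], dtype=float)

WINDOW = 10  # look back 10 rows to predict the next row

# Shape (samples, timesteps, features): each sample is WINDOW rows of 3 features
x_train = np.array([ori_data[i:i + WINDOW] for i in range(len(ori_data) - WINDOW)])
y_train = ori_data[WINDOW:]  # the row that follows each window

print(x_train.shape)  # (5, 10, 3)
print(y_train.shape)  # (5, 3)

model = Sequential()
model.add(SimpleRNN(20, input_shape=(WINDOW, 3)))  # one output vector per window
model.add(Dense(3))                                # predict all 3 columns of the next row
model.compile(loss="mean_squared_error", optimizer="adam")
model.fit(x_train, y_train, epochs=200, verbose=0)

With only 5 windows this tiny dataset will still generalize poorly, but at least the declared and actual shapes agree.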

Update

I tried increasing the training data and changing the optimizer as suggested in the comments.

Using 1000 training rows and predicting from the latest 100:

# build 1000 rows of the form [2i, 2i+1, 2i+2]
k = []
for i in range(1000):
    k = np.append(k, np.array([i * 2, i * 2 + 1, i * 2 + 2]))
ori_data = k.reshape(1000, 3)

x = np.array(ori_data[:-1])
y = np.array(ori_data[1:])


x_train = np.array(x).reshape(999, 3, 1)
y_train = np.array(y).reshape(999, 3, 1)


from tensorflow.keras.optimizers import Adam  # needed for the Adam optimizer below

NUM_DIM = 20
NUM_RNN = 100  # declared timesteps; the data is still reshaped to 3 timesteps, hence the warnings in the log below
epoch = 100    # unused; fit() below uses epochs=50
model = Sequential()

model.add(SimpleRNN(NUM_DIM, input_shape=(NUM_RNN, 1), return_sequences=True))
model.add(Dense(1, activation="linear"))
model.compile(loss='mean_squared_error', optimizer=Adam(lr=0.01, beta_1=0.9, beta_2=0.999))
model.summary()

history = model.fit(x_train, y_train, epochs=50, batch_size=12)



# seed the rollout with the first NUM_RNN rows
x_test = ori_data[0:NUM_RNN,]

NUM_DATA = 100
for i in range(NUM_DATA):
  
    y_pred = model.predict(x_test[-NUM_RNN:].reshape(NUM_RNN, 3, 1)) 
    res = y_pred[NUM_RNN-1][:,0].reshape(1,3)
    
    x_test = np.concatenate((x_test,res))

print(x_test)

However, the results did not change much.

I have two ideas:

Is SimpleRNN not suited for this purpose?

Or

is my multi-dimensional model wrong?

[[  0.           1.           2.        ]
 [  2.           3.           4.        ]
 [  4.           5.           6.        ]
 [  6.           7.           8.        ]
 [  8.           9.          10.        ]
 [ 10.          11.          12.        ]
 [ 12.          13.          14.        ]
 [ 14.          15.          16.        ]
 [ 16.          17.          18.        ]
 [ 18.          19.          20.        ]
 [ 20.          21.          22.        ]
 [ 22.          23.          24.        ]
 [ 24.          25.          26.        ]
 [ 26.          27.          28.        ]
 [ 28.          29.          30.        ]
 [ 30.          31.          32.        ]
 [ 32.          33.          34.        ]
 [ 34.          35.          36.        ]
 [ 36.          37.          38.        ]
 [ 38.          39.          40.        ]
 [ 40.          41.          42.        ]
 [ 42.          43.          44.        ]
 [ 44.          45.          46.        ]
 [ 46.          47.          48.        ]
 [ 48.          49.          50.        ]
 [ 50.          51.          52.        ]
 [ 52.          53.          54.        ]
 [ 54.          55.          56.        ]
 [ 56.          57.          58.        ]
 [ 58.          59.          60.        ]
 [ 60.          61.          62.        ]
 [ 62.          63.          64.        ]
 [ 64.          65.          66.        ]
 [ 66.          67.          68.        ]
 [ 68.          69.          70.        ]
 [ 70.          71.          72.        ]
 [ 72.          73.          74.        ]
 [ 74.          75.          76.        ]
 [ 76.          77.          78.        ]
 [ 78.          79.          80.        ]
 [ 80.          81.          82.        ]
 [ 82.          83.          84.        ]
 [ 84.          85.          86.        ]
 [ 86.          87.          88.        ]
 [ 88.          89.          90.        ]
 [ 90.          91.          92.        ]
 [ 92.          93.          94.        ]
 [ 94.          95.          96.        ]
 [ 96.          97.          98.        ]
 [ 98.          99.         100.        ]
 [100.         101.         102.        ]
 [102.         103.         104.        ]
 [104.         105.         106.        ]
 [106.         107.         108.        ]
 [108.         109.         110.        ]
 [110.         111.         112.        ]
 [112.         113.         114.        ]
 [114.         115.         116.        ]
 [116.         117.         118.        ]
 [118.         119.         120.        ]
 [120.         121.         122.        ]
 [122.         123.         124.        ]
 [124.         125.         126.        ]
 [126.         127.         128.        ]
 [128.         129.         130.        ]
 [130.         131.         132.        ]
 [132.         133.         134.        ]
 [134.         135.         136.        ]
 [136.         137.         138.        ]
 [138.         139.         140.        ]
 [140.         141.         142.        ]
 [142.         143.         144.        ]
 [144.         145.         146.        ]
 [146.         147.         148.        ]
 [148.         149.         150.        ]
 [150.         151.         152.        ]
 [152.         153.         154.        ]
 [154.         155.         156.        ]
 [156.         157.         158.        ]
 [158.         159.         160.        ]
 [160.         161.         162.        ]
 [162.         163.         164.        ]
 [164.         165.         166.        ]
 [166.         167.         168.        ]
 [168.         169.         170.        ]
 [170.         171.         172.        ]
 [172.         173.         174.        ]
 [174.         175.         176.        ]
 [176.         177.         178.        ]
 [178.         179.         180.        ]
 [180.         181.         182.        ]
 [182.         183.         184.        ]
 [184.         185.         186.        ]
 [186.         187.         188.        ]
 [188.         189.         190.        ]
 [190.         191.         192.        ]
 [192.         193.         194.        ]
 [194.         195.         196.        ]
 [196.         197.         198.        ]
 [198.         199.         200.        ]
 [406.66326904 264.12637329 234.36053467] # predict from here
 [478.32727051 264.20413208 264.20501709]
 [497.1206665  264.20425415 269.84753418]
 [518.88244629 264.20721436 285.3477478 ]
 [553.58422852 264.41403198 332.9395752 ]
 [605.4630127  275.01095581 335.0244751 ]
 [657.63604736 320.15002441 335.02252197]
 [686.33428955 336.98327637 335.01068115]
 [690.27746582 340.27520752 335.00494385]
 [690.53912354 341.20166016 335.00292969]
 [690.55493164 341.4654541  335.00231934]
 [690.5559082  341.54095459 335.00213623]
 [690.55596924 341.56268311 335.0020752 ]
 [690.55596924 341.56890869 335.0020752 ]
 [690.55596924 341.57073975 335.0020752 ]
 [690.55596924 341.57122803 335.0020752 ]
 [690.55596924 341.57141113 335.0020752 ]
 ... # the remaining rows are identical: the output has saturated
 [690.55596924 341.57141113 335.0020752 ]]

Training log


Epoch 1/50
WARNING:tensorflow:Model was constructed with shape (None, 100, 1) for input Tensor("simple_rnn_input:0", shape=(None, 100, 1), dtype=float32), but it was called on an input with incompatible shape (None, 3, 1).
WARNING:tensorflow:Model was constructed with shape (None, 100, 1) for input Tensor("simple_rnn_input:0", shape=(None, 100, 1), dtype=float32), but it was called on an input with incompatible shape (None, 3, 1).
84/84 [==============================] - 0s 981us/step - loss: 1320584.7500
Epoch 2/50
84/84 [==============================] - 0s 875us/step - loss: 1286373.3750
Epoch 3/50
84/84 [==============================] - 0s 919us/step - loss: 1253214.8750
Epoch 4/50
84/84 [==============================] - 0s 863us/step - loss: 1220912.0000
Epoch 5/50
84/84 [==============================] - 0s 880us/step - loss: 1189455.5000
Epoch 6/50
84/84 [==============================] - 0s 1ms/step - loss: 1158770.0000
Epoch 7/50
84/84 [==============================] - 0s 872us/step - loss: 1128863.6250
Epoch 8/50
84/84 [==============================] - 0s 854us/step - loss: 1099460.8750
Epoch 9/50
84/84 [==============================] - 0s 878us/step - loss: 1070951.0000
Epoch 10/50
84/84 [==============================] - 0s 860us/step - loss: 1043161.9375
Epoch 11/50
84/84 [==============================] - 0s 867us/step - loss: 1016199.6250
Epoch 12/50
84/84 [==============================] - 0s 869us/step - loss: 989827.5000
Epoch 13/50
84/84 [==============================] - 0s 871us/step - loss: 964060.3125
Epoch 14/50
84/84 [==============================] - 0s 884us/step - loss: 939177.3125
Epoch 15/50
84/84 [==============================] - 0s 865us/step - loss: 914734.3750
Epoch 16/50
84/84 [==============================] - 0s 964us/step - loss: 891065.8750
Epoch 17/50
84/84 [==============================] - 0s 866us/step - loss: 867806.8750
Epoch 18/50
84/84 [==============================] - 0s 898us/step - loss: 845191.5000
Epoch 19/50
84/84 [==============================] - 0s 952us/step - loss: 822929.1875
Epoch 20/50
84/84 [==============================] - 0s 890us/step - loss: 801307.5000
Epoch 21/50
84/84 [==============================] - 0s 883us/step - loss: 780187.2500
Epoch 22/50
84/84 [==============================] - 0s 877us/step - loss: 759871.3125
Epoch 23/50
84/84 [==============================] - 0s 862us/step - loss: 739762.5625
Epoch 24/50
84/84 [==============================] - 0s 862us/step - loss: 719785.1250
Epoch 25/50
84/84 [==============================] - 0s 867us/step - loss: 701341.1875
Epoch 26/50
84/84 [==============================] - 0s 870us/step - loss: 682664.1250
Epoch 27/50
84/84 [==============================] - 0s 896us/step - loss: 666033.3125
Epoch 28/50
84/84 [==============================] - 0s 882us/step - loss: 646248.3750
Epoch 29/50
84/84 [==============================] - 0s 874us/step - loss: 628936.5625
Epoch 30/50
84/84 [==============================] - 0s 895us/step - loss: 611742.2500
Epoch 31/50
84/84 [==============================] - 0s 932us/step - loss: 594522.4375
Epoch 32/50
84/84 [==============================] - 0s 972us/step - loss: 577997.2500
Epoch 33/50
84/84 [==============================] - 0s 920us/step - loss: 563100.8125
Epoch 34/50
84/84 [==============================] - 0s 875us/step - loss: 547811.2500
Epoch 35/50
84/84 [==============================] - 0s 870us/step - loss: 531739.3125
Epoch 36/50
84/84 [==============================] - 0s 869us/step - loss: 517295.8125
Epoch 37/50
84/84 [==============================] - 0s 905us/step - loss: 506299.7188
Epoch 38/50
84/84 [==============================] - 0s 880us/step - loss: 498595.6562
Epoch 39/50
84/84 [==============================] - 0s 890us/step - loss: 531498.5000
Epoch 40/50
84/84 [==============================] - 0s 869us/step - loss: 516119.3750
Epoch 41/50
84/84 [==============================] - 0s 870us/step - loss: 502201.9688
Epoch 42/50
84/84 [==============================] - 0s 872us/step - loss: 487284.6250
Epoch 43/50
84/84 [==============================] - 0s 886us/step - loss: 472474.1875
Epoch 44/50
84/84 [==============================] - 0s 879us/step - loss: 458271.5938
Epoch 45/50
84/84 [==============================] - 0s 946us/step - loss: 444749.9375
Epoch 46/50
84/84 [==============================] - 0s 927us/step - loss: 431224.7812
Epoch 47/50
84/84 [==============================] - 0s 856us/step - loss: 417638.5625
Epoch 48/50
84/84 [==============================] - 0s 861us/step - loss: 406279.1562
Epoch 49/50
84/84 [==============================] - 0s 867us/step - loss: 394453.1562
Epoch 50/50
84/84 [==============================] - 0s 890us/step - loss: 384451.8750
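
Two things stand out in this log, for what it's worth. First, the warnings confirm the shape mismatch: the model was built for (None, 100, 1) but is being called on (None, 3, 1). Second, the loss is still around 384,000 after 50 epochs; with raw inputs growing into the thousands, a SimpleRNN's tanh activation saturates, which is consistent with the constant rows in the prediction output above. A common remedy (not tried in the original post) is to scale the data before training, e.g. with scikit-learn, assuming it is available:

from sklearn.preprocessing import MinMaxScaler

scaler = MinMaxScaler()
scaled = scaler.fit_transform(ori_data)  # map each column into [0, 1] so tanh does not saturate

# ... build windows and train on `scaled` instead of `ori_data` ...

# after predicting, map the outputs back to the original range:
# y_pred_original = scaler.inverse_transform(y_pred)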

Update

I used LSTM instead of SimpleRNN:

from tensorflow.keras.layers import LSTM  # needed for the LSTM layer below

#model.add(SimpleRNN(NUM_DIM, input_shape=(NUM_RNN, 1), return_sequences=True))
model.add(LSTM(NUM_DIM, activation=None, input_shape=(NUM_RNN, 1), return_sequences=True))

The results improved dramatically.

Maybe SimpleRNN is just not suited for this purpose, I guess.

[[  0.           1.           2.        ]
 [  2.           3.           4.        ]
 [  4.           5.           6.        ]
 [  6.           7.           8.        ]
 [  8.           9.          10.        ]
 [ 10.          11.          12.        ]
 [ 12.          13.          14.        ]
 [ 14.          15.          16.        ]
 [ 16.          17.          18.        ]
 [ 18.          19.          20.        ]
 [ 20.          21.          22.        ]
 [ 22.          23.          24.        ]
 [ 24.          25.          26.        ]
 [ 26.          27.          28.        ]
 [ 28.          29.          30.        ]
 [ 30.          31.          32.        ]
 [ 32.          33.          34.        ]
 [ 34.          35.          36.        ]
 [ 36.          37.          38.        ]
 [ 38.          39.          40.        ]
 [ 39.9719429   41.04755783  42.01205063] # predict from here below
 [ 41.96513367  43.08579636  44.03870392]
 [ 43.98114395  45.11688995  46.06534576]
 [ 46.02096558  47.14200974  48.0843277 ]
 [ 48.08501816  49.16280365  50.09306335]
 [ 50.17318726  51.18208694  52.09262466]
 [ 52.28491592  53.20392227  54.08646393]
 [ 54.41921997  55.23324966  56.0792923 ]
 [ 56.5747757   57.27529907  58.0763855 ]
 [ 58.75000763  59.3350029   60.08308411]
 [ 60.94314575  61.41643524  62.104496  ]
 [ 63.15228653  63.52256393  64.1452179 ]
 [ 65.37547302  65.65501404  66.20915985]
 [ 67.61073303  67.81411743  68.29927826]
 [ 69.85613251  69.9990921   70.41743469]
 [ 72.10982513  72.20815277  72.56438446]
 [ 74.37003326  74.4388504   74.73970795]
 [ 76.63514709  76.68830872  76.94197845]
 [ 78.90366364  78.95341492  79.16893768]
 [ 81.17423248  81.23103333  81.41772461]
 [ 83.44564819  83.51813507  83.68505859]
 [ 85.71686554  85.81192017  85.96752167]
 [ 87.98696136  88.10987854  88.26168823]
 [ 90.25514221  90.40979767  90.56433105]
 [ 92.52074432  92.70979309  92.87243652]
 [ 94.78321838  95.00827789  95.18333435]
 [ 97.042099    97.30395508  97.49468994]
 [ 99.29704285  99.59580231  99.80455017]
 [101.54774475 101.88302612 102.11128998]
 [103.79402924 104.16498566 104.41358948]
 [106.03572083 106.44125366 106.71047211]
 [108.2727356  108.71152496 109.00115204]
 [110.5050354  110.97562408 111.28508759]
 [112.73260498 113.23347473 113.56192017]
 [114.95548248 115.48503876 115.8314209 ]
 [117.17371368 117.73036957 118.09352112]
 [119.38736725 119.96955872 120.34823608]
 [121.59655762 122.20272827 122.59564209]
 [123.80136871 124.43003845 124.83587646]
 [126.00195312 126.6516571  127.06913757]
 [128.19842529 128.86779785 129.2956543 ]
 [130.39091492 131.0786438  131.51568604]
 [132.57955933 133.28440857 133.72949219]
 [134.76451111 135.48529053 135.93736267]
 [136.94590759 137.68148804 138.13954163]
 [139.12390137 139.87322998 140.33633423]
 [141.2986145  142.06069946 142.5280304 ]
 [143.47021484 144.24411011 144.71487427]
 [145.63882446 146.42364502 146.89710999]
 [147.80458069 148.5994873  149.07501221]
 [149.96762085 150.77185059 151.24880981]
 [152.1280365  152.94088745 153.41877747]
 [154.28598022 155.10676575 155.58509827]
 [156.44155884 157.26963806 157.74801636]
 [158.59490967 159.42967224 159.90769958]
 [160.74610901 161.58702087 162.06439209]
 [162.89524841 163.74182129 164.2182312 ]
 [165.04244995 165.8941803  166.36941528]
 [167.18782043 168.04425049 168.51806641]
 [169.33140564 170.19215393 170.66436768]
 [171.4733429  172.33796692 172.8085022 ]
 [173.61366272 174.4818573  174.95051575]
 [175.75247192 176.62388611 177.09059143]
 [177.88986206 178.76416016 179.2288208 ]
 [180.02584839 180.90278625 181.3653717 ]
 [182.16053772 183.03982544 183.5002594 ]
 [184.29397583 185.17538452 185.63366699]
 [186.42622375 187.30953979 187.765625  ]
 [188.55734253 189.44233704 189.89625549]
 [190.68737793 191.57385254 192.02560425]
 [192.81639099 193.7041626  194.15379333]
 [194.94444275 195.83332825 196.28085327]
 [197.07154846 197.96142578 198.40689087]
 [199.19773865 200.08851624 200.53193665]
 [201.3230896  202.21459961 202.65603638]
 [203.44761658 204.33973694 204.77923584]
 [205.5713501  206.46398926 206.90161133]
 [207.6943512  208.5874176  209.02322388]
 [209.81660461 210.71002197 211.144104  ]
 [211.93817139 212.83190918 213.26425171]
 [214.05905151 214.95300293 215.3837738 ]
 [216.17930603 217.07339478 217.50263977]
 [218.29896545 219.19311523 219.62091064]
 [220.41802979 221.31221008 221.73861694]
 [222.53651428 223.43067932 223.85575867]
 [224.65444946 225.54858398 225.97236633]
 [226.77186584 227.66589355 228.08853149]
 [228.88874817 229.78268433 230.2041626 ]
 [231.00515747 231.89897156 232.31942749]
 [233.12104797 234.01473999 234.43423462]
 [235.23651123 236.13000488 236.5486145 ]
 [237.35151672 238.24479675 238.66256714]
 [239.46606445 240.35916138 240.77612305]
 [241.5802002  242.47306824 242.88931274]
 [243.69392395 244.58657837 245.00213623]
 [245.80723572 246.699646   247.11462402]
 [247.92015076 248.81234741 249.22677612]
 [250.03268433 250.9246521  251.33860779]
 [252.14483643 253.03659058 253.45010376]
 [254.25662231 255.14816284 255.56130981]]
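
As a quick sanity check (not in the original post), the rollout can be scored against the known ground truth. The printed output suggests the seed window was the first 20 rows, [0, 1, 2] through [38, 39, 40], even though the code above sets NUM_RNN = 100; the 100 predictions then correspond to rows 20..119 of the series. A sketch under that assumption:

import numpy as np

true = np.array([[2*i, 2*i + 1, 2*i + 2] for i in range(20, 120)], dtype=float)
pred = x_test[-100:]                     # the 100 appended predictions
print(np.abs(pred - true).mean(axis=0))  # per-column mean absolute error of the rollout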

Tags: python, tensorflow, keras

Solution


I used LSTM instead of SimpleRNN.

That solved the problem.

The post above has been updated.
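
Putting the pieces together, here is a self-contained sketch of the working approach: windows of whole rows as timesteps, 3 features per timestep, and an LSTM. Window size, unit count, and epochs are illustrative choices, not the exact values from the post.

import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

# series of rows [2i, 2i+1, 2i+2], as in the updated post
ori_data = np.array([[2*i, 2*i + 1, 2*i + 2] for i in range(1000)], dtype=float)

WINDOW = 20  # rows of history per sample

x_train = np.array([ori_data[i:i + WINDOW] for i in range(len(ori_data) - WINDOW)])
y_train = ori_data[WINDOW:]

model = Sequential()
model.add(LSTM(20, activation=None, input_shape=(WINDOW, 3)))  # linear activation suits a linear trend
model.add(Dense(3))
model.compile(loss="mean_squared_error", optimizer="adam")
model.fit(x_train, y_train, epochs=50, batch_size=32)

# closed-loop rollout: feed each prediction back in as the newest row
window = ori_data[:WINDOW].copy()
for _ in range(100):
    nxt = model.predict(window[-WINDOW:].reshape(1, WINDOW, 3), verbose=0)
    window = np.concatenate((window, nxt))
print(window)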

