
I am trying to understand artificial intelligence and neural networks, and I am a self-learner. I hope someone can help me understand how to solve this problem.

If this post shouldn't be posted here, please comment instead of downvoting it. I would appreciate that as well.

I have a question that I am totally confused about how to solve. I came across it online but was unable to work out how to approach it. I have added the question below and hope you can provide some help.

The data set contains 4 observations for 4 input variables (Temp, Pres, Flow, and Process) and an output variable (Rejects). The first column "No" is simply an identifier. The table below reproduces the first 4 observations:

No   Temp    Pres    Flow   Process   Rejects
1    53.39   10.52   4.82   0         1.88
2    46.23   15.13   5.31   0         2.13
3    42.85   18.79   3.59   0         2.66
4    53.09   18.33   3.67   0         2.03

Train a back-propagation neural network on approximately 80% of the observations, selected at random. Test the trained network on the remaining 20% of the observations.
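
For the 80/20 split, a minimal sketch using scikit-learn's train_test_split is shown below. The DataFrame name df is an assumption (the full data set loaded into pandas, with the "No" identifier column already removed), and random_state is fixed only for reproducibility:

    from sklearn.model_selection import train_test_split

    features = df.drop(['Rejects'], axis=1)   # Temp, Pres, Flow, Process as inputs
    labels = df['Rejects']                    # the single output variable

    # Randomly hold out 20% of the observations for testing;
    # the remaining 80% are used for training.
    X_train, X_test, y_train, y_test = train_test_split(
        features, labels, test_size=0.2, random_state=42)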

Question:

  1. Based on this, how do I define a fixed neural network with output values and backpropagate an expected output pattern? Here there is only one output, the "Rejects" column.
  2. Which error values need to be calculated?
  3. Is it required to define a hidden layer here, and if so, how do we define it?
  4. What type of “tool” can be used to create a report for the above inputs and get the expected output? Can you help with this? I am unsure about one other thing as well.
  5. If there is no such tool, could you provide a program so I can understand this? A tool would be preferable, though.
  6. Create a figure that plots the actual and predicted values of the output "Rejects" for the training and test data sets.
  7. Does this mean creating a chart similar to the plot we create for a Support Vector Machine? Is it possible to create that in the tool we are using for the questions above?
  8. How do I compute the sum of squared errors for the training and test data sets?

I would really appreciate your help.

2 Answers


  1. Firstly, the dataset is insanely small. However, this is the way you would approach this kind of dataset, assuming there is much more data.

    import pandas as pd
    import tensorflow as tf
    from tensorflow.keras import Sequential
    from tensorflow.keras.layers import Dropout, Dense

    # The four observations from the question, including the Flow column.
    data = {
        'No.': [1, 2, 3, 4],
        'Temp': [53.39, 46.23, 42.85, 53.09],
        'Pres': [10.52, 15.13, 18.79, 18.33],
        'Flow': [4.82, 5.31, 3.59, 3.67],
        'Process': [0, 0, 0, 0],
        'Rejects': [1.88, 2.13, 2.66, 2.03]
    }

    df = pd.DataFrame(data)
    df = df.drop(['No.'], axis=1)            # the identifier carries no information
    features = df.drop(['Rejects'], axis=1)  # Temp, Pres, Flow, Process
    labels = df['Rejects']                   # the single output to predict

    # Fully connected network; input_shape is just the number of input features.
    model = Sequential()
    model.add(Dense(1000, activation='relu', input_shape=(features.shape[1],)))
    model.add(Dropout(0.2))
    model.add(Dense(250, activation='relu'))
    model.add(Dropout(0.2))
    model.add(Dense(50, activation='relu'))
    model.add(Dropout(0.2))
    model.add(Dense(1, activation='relu'))   # single output neuron for "Rejects"

    # Mean squared error is the loss that back-propagation minimises here.
    model.compile(optimizer='adam', loss='mean_squared_error', metrics=['mse', 'mae', 'mape'])

    model.fit(features, labels, epochs=10)

    model.evaluate(features, labels)
    

    The results are not good, but that is only due to the tiny quantity of data.

    1/1 [==============================] - 0s 316ms/step - loss: 2.4341 - mse: 2.4341 - mae: 1.4419 - mape: 67.6981
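
    Regarding question 8: model.evaluate reports the mean squared error, and the sum of squared errors is just the MSE multiplied by the number of observations, or it can be computed directly from the predictions. A minimal sketch, reusing the features and labels above:

    import numpy as np

    # SSE = sum over all observations of (actual - predicted)^2
    predictions = model.predict(features).flatten()
    sse = np.sum((labels.to_numpy() - predictions) ** 2)
    print('Sum of squared errors:', sse)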
    
  2. I’m writing a fresh answer because this code is tuned to the dataset URL that you provided after the first answer. This time the accuracy is clearly much better, thanks to the quantity of data available.

    import pandas as pd

    # The full dataset is tab-separated, hence sep='\t'.
    df = pd.read_csv('data.txt', sep='\t')

    df = df.drop(['No.'], axis=1)
    features = df.drop(['Rejects'], axis=1)
    labels = df['Rejects']


    from sklearn.model_selection import train_test_split
    # Note: the original task asks for an 80/20 split; use test_size=0.2 for that.
    X_train, X_test, y_train, y_test = train_test_split(features, labels, test_size=0.3)


    import tensorflow as tf
    from tensorflow.keras import Sequential
    from tensorflow.keras.layers import Dropout, Dense


    # Same architecture as before, just wider; input_shape is the number of features.
    model = Sequential()
    model.add(Dense(2000, activation='relu', input_shape=(features.shape[1],)))
    model.add(Dropout(0.2))
    model.add(Dense(500, activation='relu'))
    model.add(Dropout(0.2))
    model.add(Dense(100, activation='relu'))
    model.add(Dropout(0.2))
    model.add(Dense(1, activation='relu'))

    model.compile(optimizer='adam', loss='mean_squared_error', metrics=['mse', 'mae', 'mape'])

    # Keep the history so the training and validation curves can be plotted.
    history = model.fit(X_train, y_train, epochs=50, validation_data=(X_test, y_test))

    import matplotlib.pyplot as plt

    # Mean absolute percentage error per epoch on the training and test (validation) sets.
    plt.plot(history.history['mape'])
    plt.plot(history.history['val_mape'])
    plt.xlabel('Epochs')
    plt.ylabel('Percentage Loss')
    plt.legend(['MAPE', 'Val_MAPE'])
    plt.show()
    

    And the model training went something like this:

    Epoch 1/50
    7/7 [==============================] - 1s 73ms/step - loss: 10.9066 - mse: 10.9066 - mae: 2.4855 - mape: 111.4739 - val_loss: 5.2550 - val_mse: 5.2550 - val_mae: 2.2529 - val_mape: 99.9000
    Epoch 2/50
    7/7 [==============================] - 0s 23ms/step - loss: 4.9923 - mse: 4.9923 - mae: 2.1328 - mape: 95.1877 - val_loss: 3.1912 - val_mse: 3.1912 - val_mae: 1.6590 - val_mape: 74.3683
    Epoch 3/50
    7/7 [==============================] - 0s 24ms/step - loss: 4.0993 - mse: 4.0993 - mae: 1.9074 - mape: 84.9316 - val_loss: 3.0207 - val_mse: 3.0207 - val_mae: 1.6149 - val_mape: 72.3441
    Epoch 4/50
    7/7 [==============================] - 0s 23ms/step - loss: 3.5641 - mse: 3.5641 - mae: 1.6932 - mape: 75.8205 - val_loss: 3.0053 - val_mse: 3.0053 - val_mae: 1.5755 - val_mape: 68.8496
    Epoch 5/50
    7/7 [==============================] - 0s 23ms/step - loss: 2.9217 - mse: 2.9217 - mae: 1.5616 - mape: 69.8578 - val_loss: 2.4539 - val_mse: 2.4539 - val_mae: 1.4140 - val_mape: 62.5867
    Epoch 6/50
    7/7 [==============================] - 0s 21ms/step - loss: 2.4518 - mse: 2.4518 - mae: 1.4247 - mape: 63.5009 - val_loss: 2.0144 - val_mse: 2.0144 - val_mae: 1.2820 - val_mape: 56.4856
    Epoch 7/50
    7/7 [==============================] - 0s 21ms/step - loss: 1.9910 - mse: 1.9910 - mae: 1.2630 - mape: 56.0590 - val_loss: 1.6839 - val_mse: 1.6839 - val_mae: 1.1723 - val_mape: 50.4525
    Epoch 8/50
    7/7 [==============================] - 0s 20ms/step - loss: 1.1813 - mse: 1.1813 - mae: 0.9188 - mape: 40.1967 - val_loss: 0.7452 - val_mse: 0.7452 - val_mae: 0.7067 - val_mape: 29.3356
    Epoch 9/50
    7/7 [==============================] - 0s 20ms/step - loss: 0.8689 - mse: 0.8689 - mae: 0.7326 - mape: 32.4377 - val_loss: 0.3546 - val_mse: 0.3546 - val_mae: 0.4791 - val_mape: 21.1433
    Epoch 10/50
    7/7 [==============================] - 0s 21ms/step - loss: 1.0251 - mse: 1.0251 - mae: 0.8172 - mape: 36.6930 - val_loss: 0.5519 - val_mse: 0.5519 - val_mae: 0.6279 - val_mape: 28.5509
    Epoch 11/50
    7/7 [==============================] - 0s 20ms/step - loss: 0.8735 - mse: 0.8735 - mae: 0.7236 - mape: 32.9642 - val_loss: 1.0568 - val_mse: 1.0568 - val_mae: 0.8415 - val_mape: 36.3284
    Epoch 12/50
    7/7 [==============================] - 0s 20ms/step - loss: 0.7933 - mse: 0.7933 - mae: 0.6918 - mape: 30.8646 - val_loss: 0.5851 - val_mse: 0.5851 - val_mae: 0.5987 - val_mape: 25.3339
    Epoch 13/50
    7/7 [==============================] - 0s 19ms/step - loss: 0.5194 - mse: 0.5194 - mae: 0.5638 - mape: 24.8541 - val_loss: 0.2628 - val_mse: 0.2628 - val_mae: 0.4087 - val_mape: 17.7300
    Epoch 14/50
    7/7 [==============================] - 0s 20ms/step - loss: 0.4954 - mse: 0.4954 - mae: 0.5518 - mape: 24.4398 - val_loss: 0.3021 - val_mse: 0.3021 - val_mae: 0.4256 - val_mape: 17.8031
    Epoch 15/50
    7/7 [==============================] - 0s 20ms/step - loss: 0.4629 - mse: 0.4629 - mae: 0.5339 - mape: 23.4556 - val_loss: 0.2119 - val_mse: 0.2119 - val_mae: 0.3771 - val_mape: 16.3196
    Epoch 16/50
    7/7 [==============================] - 0s 20ms/step - loss: 0.4563 - mse: 0.4563 - mae: 0.5222 - mape: 23.0115 - val_loss: 0.2919 - val_mse: 0.2919 - val_mae: 0.4207 - val_mape: 17.4477
    Epoch 17/50
    7/7 [==============================] - 0s 21ms/step - loss: 0.4153 - mse: 0.4153 - mae: 0.5046 - mape: 22.5874 - val_loss: 0.5661 - val_mse: 0.5661 - val_mae: 0.6011 - val_mape: 25.0547
    Epoch 18/50
    7/7 [==============================] - 0s 21ms/step - loss: 0.4056 - mse: 0.4056 - mae: 0.4932 - mape: 21.9288 - val_loss: 0.4406 - val_mse: 0.4406 - val_mae: 0.5216 - val_mape: 21.5496
    Epoch 19/50
    7/7 [==============================] - 0s 20ms/step - loss: 0.4677 - mse: 0.4677 - mae: 0.5323 - mape: 23.2442 - val_loss: 0.2383 - val_mse: 0.2383 - val_mae: 0.3868 - val_mape: 16.3032
    Epoch 20/50
    7/7 [==============================] - 0s 23ms/step - loss: 0.3991 - mse: 0.3991 - mae: 0.4907 - mape: 21.4421 - val_loss: 0.2270 - val_mse: 0.2270 - val_mae: 0.3835 - val_mape: 16.4031
    Epoch 21/50
    7/7 [==============================] - 0s 20ms/step - loss: 0.4039 - mse: 0.4039 - mae: 0.5030 - mape: 22.4905 - val_loss: 0.3142 - val_mse: 0.3142 - val_mae: 0.4375 - val_mape: 18.1178
    Epoch 22/50
    7/7 [==============================] - 0s 21ms/step - loss: 0.3628 - mse: 0.3628 - mae: 0.4799 - mape: 21.5093 - val_loss: 0.3639 - val_mse: 0.3639 - val_mae: 0.4683 - val_mape: 19.1954
    Epoch 23/50
    7/7 [==============================] - 0s 20ms/step - loss: 0.3455 - mse: 0.3455 - mae: 0.4649 - mape: 20.5179 - val_loss: 0.2378 - val_mse: 0.2378 - val_mae: 0.3864 - val_mape: 16.1129
    Epoch 24/50
    7/7 [==============================] - 0s 20ms/step - loss: 0.3276 - mse: 0.3276 - mae: 0.4523 - mape: 19.8604 - val_loss: 0.2182 - val_mse: 0.2182 - val_mae: 0.3768 - val_mape: 15.9712
    Epoch 25/50
    7/7 [==============================] - 0s 21ms/step - loss: 0.3175 - mse: 0.3175 - mae: 0.4485 - mape: 20.0487 - val_loss: 0.3083 - val_mse: 0.3083 - val_mae: 0.4336 - val_mape: 17.9253
    Epoch 26/50
    7/7 [==============================] - 0s 21ms/step - loss: 0.3289 - mse: 0.3289 - mae: 0.4514 - mape: 20.1608 - val_loss: 0.3361 - val_mse: 0.3361 - val_mae: 0.4495 - val_mape: 18.4325
    Epoch 27/50
    7/7 [==============================] - 0s 19ms/step - loss: 0.3233 - mse: 0.3233 - mae: 0.4471 - mape: 19.8604 - val_loss: 0.2534 - val_mse: 0.2534 - val_mae: 0.4036 - val_mape: 17.1102
    Epoch 28/50
    7/7 [==============================] - 0s 19ms/step - loss: 0.3226 - mse: 0.3226 - mae: 0.4441 - mape: 19.3694 - val_loss: 0.2483 - val_mse: 0.2483 - val_mae: 0.3982 - val_mape: 16.7979
    Epoch 29/50
    7/7 [==============================] - 0s 20ms/step - loss: 0.3188 - mse: 0.3188 - mae: 0.4439 - mape: 19.5424 - val_loss: 0.2392 - val_mse: 0.2392 - val_mae: 0.3908 - val_mape: 16.4567
    Epoch 30/50
    7/7 [==============================] - 0s 21ms/step - loss: 0.3109 - mse: 0.3109 - mae: 0.4457 - mape: 19.7457 - val_loss: 0.2292 - val_mse: 0.2292 - val_mae: 0.3859 - val_mape: 16.4228
    Epoch 31/50
    7/7 [==============================] - 0s 20ms/step - loss: 0.2999 - mse: 0.2999 - mae: 0.4337 - mape: 19.1884 - val_loss: 0.2527 - val_mse: 0.2527 - val_mae: 0.3966 - val_mape: 16.4614
    Epoch 32/50
    7/7 [==============================] - 0s 20ms/step - loss: 0.3091 - mse: 0.3091 - mae: 0.4313 - mape: 18.9708 - val_loss: 0.2601 - val_mse: 0.2601 - val_mae: 0.4023 - val_mape: 16.6517
    Epoch 33/50
    7/7 [==============================] - 0s 20ms/step - loss: 0.2974 - mse: 0.2974 - mae: 0.4363 - mape: 19.4105 - val_loss: 0.2839 - val_mse: 0.2839 - val_mae: 0.4175 - val_mape: 17.1926
    Epoch 34/50
    7/7 [==============================] - 0s 20ms/step - loss: 0.2786 - mse: 0.2786 - mae: 0.4177 - mape: 18.4687 - val_loss: 0.1865 - val_mse: 0.1865 - val_mae: 0.3689 - val_mape: 16.4617
    Epoch 35/50
    7/7 [==============================] - 0s 19ms/step - loss: 0.3164 - mse: 0.3164 - mae: 0.4466 - mape: 19.8367 - val_loss: 0.3088 - val_mse: 0.3088 - val_mae: 0.4362 - val_mape: 18.0655
    Epoch 36/50
    7/7 [==============================] - 0s 21ms/step - loss: 0.3097 - mse: 0.3097 - mae: 0.4339 - mape: 19.2173 - val_loss: 0.2615 - val_mse: 0.2615 - val_mae: 0.4002 - val_mape: 16.4560
    Epoch 37/50
    7/7 [==============================] - 0s 20ms/step - loss: 0.2861 - mse: 0.2861 - mae: 0.4249 - mape: 18.7808 - val_loss: 0.2223 - val_mse: 0.2223 - val_mae: 0.3794 - val_mape: 16.0339
    Epoch 38/50
    7/7 [==============================] - 0s 20ms/step - loss: 0.2967 - mse: 0.2967 - mae: 0.4334 - mape: 19.1338 - val_loss: 0.1935 - val_mse: 0.1935 - val_mae: 0.3679 - val_mape: 16.0777
    Epoch 39/50
    7/7 [==============================] - 0s 19ms/step - loss: 0.3012 - mse: 0.3012 - mae: 0.4307 - mape: 18.8958 - val_loss: 0.2027 - val_mse: 0.2027 - val_mae: 0.3718 - val_mape: 16.1167
    Epoch 40/50
    7/7 [==============================] - 0s 21ms/step - loss: 0.2926 - mse: 0.2926 - mae: 0.4204 - mape: 18.5881 - val_loss: 0.2174 - val_mse: 0.2174 - val_mae: 0.3810 - val_mape: 16.5433
    Epoch 41/50
    7/7 [==============================] - 0s 20ms/step - loss: 0.2947 - mse: 0.2947 - mae: 0.4214 - mape: 18.9445 - val_loss: 0.3573 - val_mse: 0.3573 - val_mae: 0.4648 - val_mape: 19.0649
    Epoch 42/50
    7/7 [==============================] - 0s 20ms/step - loss: 0.3088 - mse: 0.3088 - mae: 0.4332 - mape: 19.3028 - val_loss: 0.2762 - val_mse: 0.2762 - val_mae: 0.4090 - val_mape: 16.7506
    Epoch 43/50
    7/7 [==============================] - 0s 21ms/step - loss: 0.2898 - mse: 0.2898 - mae: 0.4235 - mape: 18.5388 - val_loss: 0.2007 - val_mse: 0.2007 - val_mae: 0.3747 - val_mape: 16.6556
    Epoch 44/50
    7/7 [==============================] - 0s 20ms/step - loss: 0.2835 - mse: 0.2835 - mae: 0.4168 - mape: 18.4563 - val_loss: 0.2329 - val_mse: 0.2329 - val_mae: 0.3871 - val_mape: 16.3445
    Epoch 45/50
    7/7 [==============================] - 0s 19ms/step - loss: 0.2685 - mse: 0.2685 - mae: 0.4109 - mape: 18.3725 - val_loss: 0.2807 - val_mse: 0.2807 - val_mae: 0.4141 - val_mape: 16.9569
    Epoch 46/50
    7/7 [==============================] - 0s 20ms/step - loss: 0.2783 - mse: 0.2783 - mae: 0.4205 - mape: 18.4501 - val_loss: 0.2055 - val_mse: 0.2055 - val_mae: 0.3726 - val_mape: 16.0784
    Epoch 47/50
    7/7 [==============================] - 0s 21ms/step - loss: 0.2712 - mse: 0.2712 - mae: 0.4225 - mape: 18.8953 - val_loss: 0.2424 - val_mse: 0.2424 - val_mae: 0.3906 - val_mape: 16.3056
    Epoch 48/50
    7/7 [==============================] - 0s 17ms/step - loss: 0.2623 - mse: 0.2623 - mae: 0.4113 - mape: 18.3200 - val_loss: 0.2274 - val_mse: 0.2274 - val_mae: 0.3821 - val_mape: 16.0680
    Epoch 49/50
    7/7 [==============================] - 0s 17ms/step - loss: 0.2629 - mse: 0.2629 - mae: 0.4026 - mape: 17.8561 - val_loss: 0.2516 - val_mse: 0.2516 - val_mae: 0.3948 - val_mape: 16.2785
    Epoch 50/50
    7/7 [==============================] - 0s 18ms/step - loss: 0.2641 - mse: 0.2641 - mae: 0.3969 - mape: 17.4429 - val_loss: 0.2531 - val_mse: 0.2531 - val_mae: 0.4031 - val_mape: 17.0205
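
    For question 6 (the figure of actual vs. predicted "Rejects" on the training and test data sets), a minimal sketch reusing the model, X_train/X_test and y_train/y_test from the code above:

    import matplotlib.pyplot as plt

    # Predicted "Rejects" for the training and test sets.
    train_pred = model.predict(X_train).flatten()
    test_pred = model.predict(X_test).flatten()

    # Actual vs. predicted values, one panel per data set.
    fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))
    ax1.scatter(y_train, train_pred)
    ax1.set_title('Training set')
    ax1.set_xlabel('Actual Rejects')
    ax1.set_ylabel('Predicted Rejects')
    ax2.scatter(y_test, test_pred)
    ax2.set_title('Test set')
    ax2.set_xlabel('Actual Rejects')
    ax2.set_ylabel('Predicted Rejects')
    plt.show()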
    