
Multilayer perceptron, 70/30 partition

Partition: 70% of cases assigned to training.
First neural network run.

Case Processing Summary

                        N    Percent
Sample    Training    185      99.5%
          Testing       1       0.5%
Valid                 186     100.0%
Excluded               82
Total                 268
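For readers working outside SPSS, a minimal Python sketch of an equivalent random 70/30 partition is shown below. The file name shares.csv and the DataFrame df are assumptions; the column names match the variables used in this run.

import pandas as pd
from sklearn.model_selection import train_test_split

df = pd.read_csv("shares.csv")             # assumed input file with the share data
X = df[["Open", "High", "Low", "Volume"]]  # covariates used in this run
y = df["Close"]                            # dependent variable

# 70/30 random partition, mirroring the intended SPSS split.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, train_size=0.70, random_state=0
)
print(len(X_train), len(X_test))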

Network Information

Input Layer       Factors
                  Covariates                              1  Open
                                                          2  High
                                                          3  Low
                                                          4  Volume
                  Number of Units (a)                     513
                  Rescaling Method for Covariates         Standardized
Hidden Layer(s)   Number of Hidden Layers                 1
                  Number of Units in Hidden Layer 1 (a)   19
                  Activation Function                     Hyperbolic tangent
Output Layer      Dependent Variables                     1  Close
                  Number of Units                         1
                  Rescaling Method for Scale Dependents   Standardized
                  Activation Function                     Identity
                  Error Function                          Sum of Squares
a. Excluding the bias unit
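To reproduce this topology outside SPSS, a minimal scikit-learn sketch is given below: standardized covariates, one hidden layer of 19 tanh units, and an identity-output regression trained on squared error (MLPRegressor's defaults for regression). It reuses X_train and y_train from the partition sketch above and does not replicate SPSS's exact training algorithm or stopping rule.

from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# One hidden layer of 19 units with tanh activation, per the Network
# Information table; identity output activation and squared-error loss
# are MLPRegressor's regression defaults.
model = make_pipeline(
    StandardScaler(),  # the "Standardized" rescaling of the covariates
    MLPRegressor(hidden_layer_sizes=(19,), activation="tanh",
                 max_iter=2000, random_state=0),
)
model.fit(X_train, y_train)
predicted = model.predict(X_test)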

Model Summary

Training   Sum of Squares Error   8.292
           Relative Error         .090
           Stopping Rule Used     1 consecutive step(s) with no decrease in error (a)
           Training Time          0:00:01.79
Testing    Sum of Squares Error   4.992E-5
           Relative Error         . (b)
Dependent Variable: Close
a. Error computations are based on the testing sample.
b. Cannot be computed. The dependent variable may be constant in the testing sample.
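SPSS reports relative error for a scale dependent as the model's sum-of-squares error divided by the sum-of-squares error of a mean-only model; with effectively a single testing case that denominator is zero, which is what footnote b signals. A minimal sketch of that ratio, assuming the model and data objects from the sketches above:

import numpy as np

def relative_error(y_true, y_pred):
    """Model SSE divided by the SSE of a mean-only model;
    undefined when y_true is constant."""
    y_true = np.asarray(y_true, dtype=float)
    sse_model = np.sum((y_true - y_pred) ** 2)
    sse_mean = np.sum((y_true - y_true.mean()) ** 2)
    return sse_model / sse_mean if sse_mean > 0 else float("nan")

# Compare with the reported training relative error of .090.
print(relative_error(y_train, model.predict(X_train)))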

Neural network predictions v/s closing values

[Line chart: the actual Close series plotted against the network's predicted
values over the sample period; y-axis scaled from 0 to 3000, with a legend
distinguishing Close from Predicted value. The accompanying table listed, for
each case, the actual Close, the Predicted value, and the Error as a
percentage; the legible error values all fall below roughly 4%.]
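The Error column appears to be the absolute percentage deviation of each prediction from the actual close; for example |2227.71 − 2216| / 2216 × 100 ≈ 0.52843, matching one of the listed rows. A minimal sketch of that computation:

import numpy as np

def pct_error(close, predicted):
    """Absolute percentage error of each prediction relative to the
    actual closing price, as the Error column appears to be defined."""
    close = np.asarray(close, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    return np.abs(predicted - close) / close * 100.0

print(pct_error([2216.0], [2227.71]))  # -> [0.52843...]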

Multilayer perceptron, 70/20/10 partition


Case Processing Summary

                        N    Percent
Sample    Training    192      99.5%
          Testing       1       0.5%
Valid                 193     100.0%
Excluded               75
Total                 268
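A 70/20/10 training/testing/holdout partition, as the heading describes, can be sketched with two chained splits. This reuses the X and y defined earlier and is illustrative only, since the summary above reports only training and testing rows.

from sklearn.model_selection import train_test_split

# First carve off 70% for training, then split the remaining 30% into
# testing (20% of the total) and holdout (10% of the total).
X_train, X_rest, y_train, y_rest = train_test_split(
    X, y, train_size=0.70, random_state=0
)
X_test, X_hold, y_test, y_hold = train_test_split(
    X_rest, y_rest, train_size=2 / 3, random_state=0
)
print(len(X_train), len(X_test), len(X_hold))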

Network Information

Input Layer       Factors
                  Covariates                              1  Open
                                                          2  High
                                                          3  Low
                                                          4  Volume
                  Number of Units (a)                     534
                  Rescaling Method for Covariates         Standardized
Hidden Layer(s)   Number of Hidden Layers                 1
                  Number of Units in Hidden Layer 1 (a)   19
                  Activation Function                     Hyperbolic tangent
Output Layer      Dependent Variables                     1  Close
                  Number of Units                         1
                  Rescaling Method for Scale Dependents   Standardized
                  Activation Function                     Identity
                  Error Function                          Sum of Squares
a. Excluding the bias unit

Model Summary

Training   Sum of Squares Error   88.736
           Relative Error         .929
           Stopping Rule Used     1 consecutive step(s) with no decrease in error (a)
           Training Time          0:00:01.93
Testing    Sum of Squares Error   1.058E-7
           Relative Error         . (b)
Dependent Variable: Close
a. Error computations are based on the testing sample.
b. Cannot be computed. The dependent variable may be constant in the testing sample.
