MATLAB Neural Networks: A Custom Multilayer Perceptron for XOR (2)

 

Continuing from part (1), we keep defining the network's individual neurons.

 

net.inputs{i}.range


This property defines the range of each element of the ith network input.

It can be set to any Ri x 2 matrix, where Ri is the number of elements in the input (net.inputs{i}.size), and each element in column 1 is less than the element next to it in column 2.

Each jth row defines the minimum and maximum values of the jth input element, in that order:
net.inputs{i}.range(j,:)

 

Uses.   Some initialization functions use input ranges to find appropriate initial values for input weight matrices.

Side Effects.   Whenever the number of rows in this property is altered, the input size, processedSize, and processedRange change to remain consistent. The sizes of any weights coming from this input and the dimensions of the weight matrices also change.

 

>> net.inputs{1}.range=[0 1;0 1]

net =

    Neural Network object:

    architecture:

         numInputs: 1
         numLayers: 2
       biasConnect: [1; 1]
      inputConnect: [1; 0]
      layerConnect: [0 0; 1 0]
     outputConnect: [0 1]

        numOutputs: 1  (read-only)
    numInputDelays: 0  (read-only)
    numLayerDelays: 0  (read-only)

    subobject structures:

            inputs: {1x1 cell} of inputs
            layers: {2x1 cell} of layers
           outputs: {1x2 cell} containing 1 output
            biases: {2x1 cell} containing 2 biases
      inputWeights: {2x1 cell} containing 1 input weight
      layerWeights: {2x2 cell} containing 1 layer weight

    functions:

          adaptFcn: (none)
         divideFcn: (none)
       gradientFcn: (none)
           initFcn: (none)
        performFcn: (none)
          plotFcns: {}
          trainFcn: (none)

    parameters:

        adaptParam: (none)
       divideParam: (none)
     gradientParam: (none)
         initParam: (none)
      performParam: (none)
        trainParam: (none)

    weight and bias values:

                IW: {2x1 cell} containing 1 input weight matrix
                LW: {2x2 cell} containing 1 layer weight matrix
                 b: {2x1 cell} containing 2 bias vectors

    other:

              name: ''
          userdata: (user information)

>>
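A quick check of the side effect described above (a minimal sketch against the session shown here; exact output may vary with the toolbox version): the input size now tracks the two rows of range, and the input weight matrix follows it.

>> net.inputs{1}.size     % reports 2, matching the two rows of range just set
>> size(net.IW{1,1})      % 0-by-2 for now: columns follow the input size, rows follow layer 1's size (still 0)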

======

 

 

net.layers{i}.size


This property defines the number of neurons in the ith layer. It can be set to 0 or a positive integer.

Side Effects.   Whenever this property is altered, the sizes of any input weights going to the layer (net.inputWeights{i,:}.size), any layer weights going to the layer (net.layerWeights{i,:}.size) or coming from the layer (net.layerWeights{:,i}.size), and the layer's bias (net.biases{i}.size), change.

The dimensions of the corresponding weight matrices (net.IW{i,:}, net.LW{i,:}, net.LW{:,i}), and biases (net.b{i}) also change.

Changing this property also changes the size of the layer's output (net.outputs{i}.size) and target (net.targets{i}.size) if they exist.

Finally, when this property is altered, the dimensions of the layer's neurons (net.layers{i}.dimensions) are set to the same value. (This results in a one-dimensional arrangement of neurons. If another arrangement is required, set the dimensions property directly instead of using size.)

======= 

>> net.layers{1}.size=2

net =

    Neural Network object:

    architecture:

         numInputs: 1
         numLayers: 2
       biasConnect: [1; 1]
      inputConnect: [1; 0]
      layerConnect: [0 0; 1 0]
     outputConnect: [0 1]

        numOutputs: 1  (read-only)
    numInputDelays: 0  (read-only)
    numLayerDelays: 0  (read-only)

    subobject structures:

            inputs: {1x1 cell} of inputs
            layers: {2x1 cell} of layers
           outputs: {1x2 cell} containing 1 output
            biases: {2x1 cell} containing 2 biases
      inputWeights: {2x1 cell} containing 1 input weight
      layerWeights: {2x2 cell} containing 1 layer weight

    functions:

          adaptFcn: (none)
         divideFcn: (none)
       gradientFcn: (none)
           initFcn: (none)
        performFcn: (none)
          plotFcns: {}
          trainFcn: (none)

    parameters:

        adaptParam: (none)
       divideParam: (none)
     gradientParam: (none)
         initParam: (none)
      performParam: (none)
        trainParam: (none)

    weight and bias values:

                IW: {2x1 cell} containing 1 input weight matrix
                LW: {2x2 cell} containing 1 layer weight matrix
                 b: {2x1 cell} containing 2 bias vectors

    other:

              name: ''
          userdata: (user information)

>>
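As the documentation above notes, resizing a layer cascades into its neuron dimensions, its bias, and the weights connected to it. A brief hedged check on the session above:

>> net.layers{1}.dimensions   % set to 2 along with size (a one-dimensional arrangement of neurons)
>> size(net.b{1})             % the layer 1 bias vector is now 2-by-1
>> size(net.IW{1,1})          % the input weight matrix is now 2-by-2 (2 neurons by 2 input elements)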

 

 

=====

net.layers{i}.initFcn


This property defines which of the layer initialization functions is used to initialize the ith layer when the network initialization function (net.initFcn) is initlay; in that case, the function indicated by this property is used to initialize the layer's weights and biases.

For a list of functions, type
help nninit

=====

 

>> net.layers{1}.initFcn='initnw'

net =

    Neural Network object:

    architecture:

         numInputs: 1
         numLayers: 2
       biasConnect: [1; 1]
      inputConnect: [1; 0]
      layerConnect: [0 0; 1 0]
     outputConnect: [0 1]

        numOutputs: 1  (read-only)
    numInputDelays: 0  (read-only)
    numLayerDelays: 0  (read-only)

    subobject structures:

            inputs: {1x1 cell} of inputs
            layers: {2x1 cell} of layers
           outputs: {1x2 cell} containing 1 output
            biases: {2x1 cell} containing 2 biases
      inputWeights: {2x1 cell} containing 1 input weight
      layerWeights: {2x2 cell} containing 1 layer weight

    functions:

          adaptFcn: (none)
         divideFcn: (none)
       gradientFcn: (none)
           initFcn: (none)
        performFcn: (none)
          plotFcns: {}
          trainFcn: (none)

    parameters:

        adaptParam: (none)
       divideParam: (none)
     gradientParam: (none)
         initParam: (none)
      performParam: (none)
        trainParam: (none)

    weight and bias values:

                IW: {2x1 cell} containing 1 input weight matrix
                LW: {2x2 cell} containing 1 layer weight matrix
                 b: {2x1 cell} containing 2 bias vectors

    other:

              name: ''
          userdata: (user information)

>>

 

>> net.layers{2}.size=1

>> net.layers{2}.initFcn='initnw'

>> net.layers{2}.transferFcn='hardlim'

net =

    Neural Network object:

    architecture:

         numInputs: 1
         numLayers: 2
       biasConnect: [1; 1]
      inputConnect: [1; 0]
      layerConnect: [0 0; 1 0]
     outputConnect: [0 1]

        numOutputs: 1  (read-only)
    numInputDelays: 0  (read-only)
    numLayerDelays: 0  (read-only)

    subobject structures:

            inputs: {1x1 cell} of inputs
            layers: {2x1 cell} of layers
           outputs: {1x2 cell} containing 1 output
            biases: {2x1 cell} containing 2 biases
      inputWeights: {2x1 cell} containing 1 input weight
      layerWeights: {2x2 cell} containing 1 layer weight

    functions:

          adaptFcn: (none)
         divideFcn: (none)
       gradientFcn: (none)
           initFcn: (none)
        performFcn: (none)
          plotFcns: {}
          trainFcn: (none)

    parameters:

        adaptParam: (none)
       divideParam: (none)
     gradientParam: (none)
         initParam: (none)
      performParam: (none)
        trainParam: (none)

    weight and bias values:

                IW: {2x1 cell} containing 1 input weight matrix
                LW: {2x2 cell} containing 1 layer weight matrix
                 b: {2x1 cell} containing 2 bias vectors

    other:

              name: ''
          userdata: (user information)

>>
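Both layers now have a size and a layer initialization function, and layer 2 uses hardlim. Per the net.layers{i}.initFcn documentation above, these layer-level initializers only run when the network-level initialization function is initlay, which the transcript has not set. A hedged sketch of that step (initnw, the Nguyen-Widrow rule, is aimed at layers with sigmoid-like transfer functions, so how useful its values are for the hardlim layer depends on the toolbox version):

>> net.initFcn = 'initlay';   % let init(net) delegate to each layer's initFcn ('initnw' above)
>> net = init(net);           % re-initialize the weights and biases layer by layer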

=

 

 

net.layers{i}.transferFcn


This property defines which of the transfer functions is used to calculate the ith layer's output, given the layer's net input, during simulation and training.

For a list of functions, type
help nntransfer

=
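hardlim, chosen for layer 2 above, simply thresholds the net input at zero, while layer 1 keeps its default transfer function unless it is changed elsewhere. A small hedged illustration:

>> hardlim([-0.5 0 0.7])       % ans = [0 1 1]: outputs 1 wherever the net input is >= 0
>> net.layers{1}.transferFcn   % layer 1's transfer function ('purelin' by default, unless set elsewhere)
>> net.layers{2}.transferFcn   % 'hardlim', as set above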


===

 

net.adaptFcn


This property defines the function to be used when the network adapts. It can be set to the name of any network adapt function. The network adapt function is used to perform adaption whenever adapt is called.
[net,Y,E,Pf,Af] = adapt(NET,P,T,Pi,Ai)

 

For a list of functions, type
help nntrain

Side Effects.   Whenever this property is altered, the network's adaption parameters (net.adaptParam) are set to contain the parameters and default values of the new function.

===

>> net.adaptFcn='trains'

net =

    Neural Network object:

    architecture:

         numInputs: 1
         numLayers: 2
       biasConnect: [1; 1]
      inputConnect: [1; 0]
      layerConnect: [0 0; 1 0]
     outputConnect: [0 1]

        numOutputs: 1  (read-only)
    numInputDelays: 0  (read-only)
    numLayerDelays: 0  (read-only)

    subobject structures:

            inputs: {1x1 cell} of inputs
            layers: {2x1 cell} of layers
           outputs: {1x2 cell} containing 1 output
            biases: {2x1 cell} containing 2 biases
      inputWeights: {2x1 cell} containing 1 input weight
      layerWeights: {2x2 cell} containing 1 layer weight

    functions:

          adaptFcn: 'trains'
         divideFcn: (none)
       gradientFcn: (none)
           initFcn: (none)
        performFcn: (none)
          plotFcns: {}
          trainFcn: (none)

    parameters:

        adaptParam: .passes
       divideParam: (none)
     gradientParam: (none)
         initParam: (none)
      performParam: (none)
        trainParam: (none)

    weight and bias values:

                IW: {2x1 cell} containing 1 input weight matrix
                LW: {2x2 cell} containing 1 layer weight matrix
                 b: {2x1 cell} containing 2 bias vectors

    other:

              name: ''
          userdata: (user information)

>>
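With adaptFcn set to 'trains' (sequential, pass-by-pass adaption), the call pattern quoted in the documentation can be applied to the XOR data. This is only a hedged sketch of the syntax: depending on the toolbox version, each weight and bias also needs a learning function (net.inputWeights{i,j}.learnFcn, net.biases{i}.learnFcn) before adapt actually updates anything.

>> P = [0 0 1 1; 0 1 0 1];       % the four XOR input patterns, one per column
>> T = [0 1 1 0];                % XOR targets
>> [net,Y,E] = adapt(net,P,T);   % one adaption pass; Y are the outputs, E = T - Y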

==========

 

 

net.performFcn


This property defines the function used to measure the network's performance. You can set it to the name of any of the performance functions. The performance function is used to calculate network performance during training whenever train is called.
[net,tr] = train(NET,P,T,Pi,Ai)

 

For a list of functions, type
help nnperformance

 

Side Effects.   Whenever this property is altered, the network's performance parameters (net.performParam) are set to contain the parameters and default values of the new function.

==========

>> net.performFcn='mse'
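With 'mse' the network's performance is the mean squared error between targets and outputs. The same measure can be computed by hand, e.g. (error values assumed purely for illustration):

>> E = [0 0.5 -0.5 0];   % an assumed error vector (targets minus outputs)
>> mse(E)                % ans = 0.1250, i.e. mean(E(:).^2)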

======

 

 

======

>> net.trainFcn='trainlm'
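Finally, with trainlm (Levenberg-Marquardt) selected, the network can be trained on the XOR set and simulated. A hedged end-to-end sketch: trainlm is gradient-based, so the non-differentiable hardlim output layer configured above may keep it from learning, and layer 1 would need a nonlinear transfer function (e.g. tansig) for XOR to be representable; treat this as a syntax sketch rather than the author's final recipe.

>> P = [0 0 1 1; 0 1 0 1];        % XOR inputs
>> T = [0 1 1 0];                 % XOR targets
>> net.trainParam.epochs = 100;   % assumed limit on training epochs
>> [net,tr] = train(net,P,T);     % train with Levenberg-Marquardt
>> Y = sim(net,P)                 % ideally Y approximates [0 1 1 0]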
