ica                  package:e1071                  R Documentation

_I_n_d_e_p_e_n_d_e_n_t _C_o_m_p_o_n_e_n_t _A_n_a_l_y_s_i_s

_D_e_s_c_r_i_p_t_i_o_n:

     This is an R implementation of the Matlab function by Petteri
     Pajunen (Petteri.Pajunen@hut.fi).

     For a data matrix X, independent components are extracted by
     applying a nonlinear PCA algorithm. The parameter 'fun' determines
     which nonlinearity is used. 'fun' can either be a function or one
     of the strings "negative kurtosis", "positive kurtosis", or "4th
     moment", which may be abbreviated as long as the abbreviation is
     unique. If 'fun' equals "negative kurtosis", the function tanh(x)
     is used, which provides ICA for sources with negative kurtosis; if
     'fun' equals "positive kurtosis", the function x - tanh(x) is
     used, for sources with positive kurtosis. For 'fun == "4th
     moment"' the signed square function is used.
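     The underlying learning procedure is the nonlinear PCA subspace
     rule of Karhunen and Joutsensalo (1995). The following is an
     illustrative sketch of one update step, not necessarily the exact
     implementation used internally; the function name 'nlpca_step' and
     all numeric values are hypothetical:

```r
## Sketch of one step of the nonlinear PCA subspace rule
## (Karhunen & Joutsensalo, 1995); hypothetical, for illustration only.
nlpca_step <- function(W, x, lrate, g = tanh) {
  y <- W %*% x               # project the sample onto the current subspace
  e <- x - t(W) %*% g(y)     # reconstruction error through the nonlinearity
  W + lrate * g(y) %*% t(e)  # Hebbian-style weight update
}

set.seed(1)
d <- 4; ncomp <- 2
W <- matrix(rnorm(ncomp * d, sd = 0.1), ncomp, d)  # initial weights
x <- rnorm(d)                                      # one data sample
W <- nlpca_step(W, x, lrate = 0.01)
```

     Iterating this update over all rows of X for 'epochs' passes, with
     g chosen according to 'fun', yields the weight matrix reported in
     the result.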

_U_s_a_g_e:

     ica(X, lrate, epochs=100, ncomp=dim(X)[2], fun="negative")

_A_r_g_u_m_e_n_t_s:

       X: The matrix for which the ICA is to be computed

   lrate: learning rate

  epochs: number of iterations

   ncomp: number of independent components

     fun: function used for the nonlinear computation part

_V_a_l_u_e:

     An object of class '"ica"' which is a list with components 

 weights: ICA weight matrix

projection: Projected data

  epochs: Number of iterations

     fun: Name of the function used

   lrate: Learning rate used

initweights: Initial weight matrix

_N_o_t_e:

     Currently, there is no reconstruction from the ICA subspace to the
     original input space.

_A_u_t_h_o_r(_s):

     Andreas Weingessel

_R_e_f_e_r_e_n_c_e_s:

     Oja et al., ``Learning in Nonlinear Constrained Hebbian
     Networks'', in Proc. ICANN-91, pp. 385-390.

     Karhunen and Joutsensalo, ``Generalizations of Principal Component
     Analysis, Optimization Problems, and Neural Networks'', Neural
     Networks, v. 8, no. 4, pp. 549-562, 1995.
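
Examples:

     A toy run on two linearly mixed sources with negative kurtosis
     (uniform sources); the mixing matrix and learning rate below are
     illustrative choices, not recommendations:

```r
library(e1071)

## Two independent uniform sources (negative kurtosis), linearly mixed.
set.seed(42)
S <- matrix(runif(2000, -1, 1), ncol = 2)  # independent sources
A <- matrix(c(1, 0.5, 0.5, 1), 2, 2)       # mixing matrix (illustrative)
X <- S %*% A

res <- ica(X, lrate = 0.01)  # default fun abbreviates "negative kurtosis"
str(res$weights)             # ICA weight matrix
str(res$projection)          # projected (unmixed) data
```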

