TY  - CHAP
AB  - Extreme learning machines are single-hidden-layer feedforward neural networks in which training is restricted to the output weights in order to achieve fast learning with good performance. The success of learning strongly depends on the random parameter initialization. To overcome the problem of unsuitable initialization ranges, a novel and efficient pretraining method that adapts extreme learning machines to a specific task is presented. The pretraining aims at desired output distributions of the hidden neurons. It leads to better performance and less dependence on the size of the hidden layer.
DA  - 2011
DO  - 10.1007/978-3-642-21735-7_42
KW  - CoR-Lab Publication
LA  - eng
PY  - 2011
SN  - 978-3-642-21734-0
TI  - Batch Intrinsic Plasticity for Extreme Learning Machines
UR  - https://nbn-resolving.org/urn:nbn:de:0070-pub-21419687
Y2  - 2024-11-22T01:17:25
ER  - 