The NICO Toolkit


Although it was originally developed for speech recognition applications, the NICO toolkit is a general-purpose toolkit for constructing artificial neural networks and training them with the back-propagation learning algorithm. The network topology is very flexible. Units are organized in groups, and groups are hierarchical: a group can have sub-groups or other objects as members. This makes it easy to specify multi-layer networks with arbitrary connection structure and to build modular networks.

Recurrence and time delays are constrained only by the condition that the activation of a unit cannot depend on its own activity at the present or any future time (this is checked automatically every time the user connects two groups or units). A convenient feature is that the user can specify not only time-delay connections but also look-ahead connections. All connections are converted to time delays when the network runs, but this is completely hidden from the user.

The NICO toolkit was developed to solve a particular problem (phoneme probability estimation for automatic speech recognition), and the focus was not on research in the field of artificial neural networks. There are no fancy graphical tools to monitor network behavior or characteristics. But if you want to solve a real problem with a large database (millions of samples) and possibly large networks (e.g., 500,000 connections or more), then this toolkit is probably still a good choice, as it is optimized for fast training and evaluation of large networks.

Communication with external data is handled conveniently by stream objects in the network. A network may have multiple input and output streams. The streams normalize the data to a range suitable for ANN computation. Streams can read data in several different file formats.

The only training algorithm available is a fast implementation of the back-propagation learning algorithm. It is optimized for high-performance back-propagation through time, i.e., for training networks with recurrence and/or time-delay windows. The reason that none of the theoretically more advanced algorithms (such as quasi-Newton or conjugate gradient methods) are available is that the NICO toolkit was developed for training large networks, with training typically terminated before the minimum error is reached (to prevent over-training) and with weights updated many times per epoch. For this type of problem, back-propagation with momentum is a good choice.

Units of several different types can be arbitrarily mixed in a network. Some of the unit types are:

  • Sigmoid units
  • Tanhyp units
  • Linear units
  • Multiplication units
  • Exponential units (e^x)
  • Inverter units (1/x)
  • Environment units (take their activation from the system environment)
  • File filter units (boolean units that check the filename of the input data)

and other unit types, with for example x^2, sigma-pi, softmax, and radial basis activation functions, can be modeled by combining the primitive unit types above.

Many different error functions are supported and can be arbitrarily mixed in a network. Some of the error functions are:

  • Mean square error
  • Cross entropy
  • Absolute value



Q: Under what licence is the NICO Toolkit distributed?

A: The NICO Toolkit is distributed under the BSD license.


Q: What platforms are supported?

A: Really anywhere gcc runs. No binaries are distributed so you do need to compile it yourself. I only test on Linux, but I know others are running it on all flavors of UNIX.


The NICO (Neural Inference COmputation) tool-kit was initially developed during the years 1993-97 by myself, Nikko Ström, at the Department for Speech, Music, and Hearing at KTH, Stockholm, Sweden. Many thanks to Björn Granström and Rolf Carlsson at the department for the opportunity to work on the project. I am also indebted to all the staff and researchers at the department for encouragement and guidance, in particular Kjell Elenius and Mats Blomgren.

Over the years the NICO toolkit has been downloaded thousands of times, mostly by researchers and students in the field of Automatic Speech Recognition. Many have provided valuable feedback, which has contributed greatly to the relative robustness of the current code. The intention is to continue and accelerate on that path by hosting the code here at SourceForge under the BSD license.


The downloaded file is an archive including the source code of the toolkit and a local copy of the HTML documentation. After uncompressing and extracting the archive with the command:

tar xfz nico_v1.1.2.tar.gz

you get the following file structure:

           +-- bin
           +-- doc
           +-- lib
           +-- tools
           +-- toy-examples
           +-- speech-example

First build the library:

cd lib
make


Then cd to the tools directory and compile all tools:

cd ../tools
make


If nothing went wrong, you should now have a set of executable files in the bin directory. You may want to add this bin directory to your $PATH environment variable, or copy the executables to /usr/local/bin.

In the doc directory you have your local copy of the manual. The top-level page is doc/index.html.