Features
The main purpose of the machine learning framework is
to build abstract models from arbitrary data sources. If an explicit target is
identified (supervised learning), the framework can be used to create a
model for forecasting this target parameter. If no such target parameter
is available (unsupervised learning), the framework can identify related
items and create models that classify new items according to this
segmentation.
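For illustration only, the distinction can be sketched in a few lines of Mathematica code; the toy data, the hand-written nearest-neighbour rule, and the built-in FindClusters call below are generic examples and not the framework's own functions.

  (* Supervised: every sample carries an explicit target ("A" or "B"). *)
  labeled = {{1.0, 2.0} -> "A", {1.1, 1.9} -> "A",
             {5.0, 5.2} -> "B", {5.1, 4.9} -> "B"};
  (* A minimal 1-nearest-neighbour forecast for a new sample. *)
  predict[x_] := Last[First[SortBy[labeled, Norm[First[#] - x] &]]]
  predict[{4.8, 5.0}]     (* -> "B" *)

  (* Unsupervised: no target; related items are grouped into segments. *)
  unlabeled = {{1.0, 2.0}, {1.1, 1.9}, {5.0, 5.2}, {5.1, 4.9}};
  FindClusters[unlabeled, 2]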
New in Version 2
- Grid enabled for running data mining tasks in parallel
- Powerful kernel methods for classification and regression (see the
sketch after this list)
- Integration of user-defined algorithms
- New meta-learning algorithms for feature selection, boosting, and
mixtures of experts
- Easily build models for time series data
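To give a rough idea of what a kernel method does, the following is a generic Gaussian-kernel ridge regression written directly in Mathematica; it is a sketch of the general technique, not the framework's implementation or API.

  (* Gaussian kernel between two numeric vectors. *)
  kernel[x_, y_, sigma_] := Exp[-Norm[x - y]^2/(2 sigma^2)]

  (* Kernel ridge regression: solve (K + lambda I) alpha = y and return *)
  (* a predictor that combines the kernels with the weights alpha.      *)
  kernelRidgeFit[xs_, ys_, sigma_, lambda_] := Module[{K, alpha},
    K = Outer[kernel[#1, #2, sigma] &, xs, xs, 1];
    alpha = LinearSolve[K + lambda IdentityMatrix[Length[xs]], ys];
    Function[x, alpha . (kernel[x, #, sigma] & /@ xs)]]

  (* Example: learn y = Sin[2 Pi x] from noisy one-dimensional samples. *)
  xs = RandomReal[1, {30, 1}];
  ys = Sin[2 Pi (First /@ xs)] + RandomReal[{-0.1, 0.1}, 30];
  f = kernelRidgeFit[xs, ys, 0.2, 0.01];
  f[{0.5}]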
Supervised Analysis
- Decision trees
- FS-ID3 is a fuzzy variant of the ID3 learning algorithm to
create decision trees (the classical ID3 criterion it builds on is
sketched after this list).
- Rule induction
- FS-FOIL is a fuzzy variant of Quinlan's FOIL method.
- FS-MINER is a proprietary method from SCCH GmbH to find cluster
descriptions.
- Numerical optimization of fuzzy rules
- RENO is a proprietary method from SCCH GmbH that uses
numerical optimization to find accurate and robust fuzzy rules.
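Because the fuzzy extensions above are proprietary, the sketch below only illustrates the classical ID3 idea that FS-ID3 builds on: split on the attribute with the highest information gain (toy data, not the framework's API).

  (* Shannon entropy of a list of class labels. *)
  entropy[labels_] :=
    -Total[# Log[2, #] & /@ (Tally[labels][[All, 2]]/Length[labels])]

  (* Information gain of splitting on attribute column a; the last *)
  (* column of each sample is assumed to hold the class label.     *)
  infoGain[samples_, a_] := entropy[samples[[All, -1]]] -
    Total[(Length[#]/Length[samples]) entropy[#[[All, -1]]] & /@
      GatherBy[samples, #[[a]] &]]

  (* ID3 greedily splits on the attribute with the highest gain. *)
  data = {{"sunny", "hot", "no"}, {"sunny", "mild", "yes"},
          {"rain", "hot", "no"}, {"rain", "mild", "yes"}};
  Last[SortBy[{1, 2}, infoGain[data, #] &]]   (* -> 2: split on the second attribute *)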
Unsupervised Analysis
- Self-organizing maps
- Create two-dimensional plots of high-dimensional data sets.
- Preprocess large and noisy data sets.
- Recall (estimate) one or more missing values in the data.
- Fuzzy c-means clustering and Ward clustering
- Fuzzy c-means clustering creates a fuzzy segmentation of the data
(see the sketch after this list).
- Ward clustering is a crisp, agglomerative clustering method.
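For reference, the classical fuzzy c-means iteration (alternating between weighted centres and the standard membership update) can be written in a few lines; this is a generic sketch, not the framework's implementation, and it assumes no sample coincides exactly with a centre.

  (* Fuzzy c-means with fuzzifier m, run for a fixed number of steps. *)
  fuzzyCMeans[data_, k_, m_ : 2, steps_ : 50] :=
    Module[{n = Length[data], u, centers},
      u = #/Total[#] & /@ RandomReal[1, {n, k}];   (* random fuzzy memberships *)
      Do[
        centers = Table[Total[u[[All, j]]^m data]/Total[u[[All, j]]^m], {j, k}];
        u = Table[1/Total[Table[
              (Norm[data[[i]] - centers[[j]]]/
                 Norm[data[[i]] - centers[[l]]])^(2/(m - 1)), {l, k}]],
            {i, n}, {j, k}],
        {steps}];
      {centers, u}]

  (* Example: three fuzzy segments of 200 random 2D points. *)
  {centers, memberships} = fuzzyCMeans[RandomReal[1, {200, 2}], 3];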
Tasks
- Forecasting
- Includes various inference methods for applying the created models
to new cases/samples.
- Classification
- Uses straightforward decision trees and rule-based methods
to predict the membership of a new sample in a previously defined
set of classes.
- Logical inference
- Includes logical inference methods, such as Sugeno and
Takagi-Sugeno-Kang controllers, to predict numerical values using rule
bases and decision trees (see the sketch after this list).
Self-organizing maps (SOMs) can also predict new values in a
straightforward way.
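As an illustration of this kind of inference, a minimal first-order Takagi-Sugeno-Kang step with two made-up rules on a single input can be written as follows; the membership functions and rule consequents are hypothetical and chosen only for the example.

  (* Fuzzy membership functions for "low" and "high" on the range 0..5. *)
  low[x_] := Clip[(5 - x)/5, {0, 1}]
  high[x_] := Clip[x/5, {0, 1}]

  (* Rules:  IF x is low   THEN y = 2 x + 1    *)
  (*         IF x is high  THEN y = 0.5 x + 4  *)
  (* The crisp output is the firing-strength-weighted average of the *)
  (* rule consequents.                                               *)
  tskPredict[x_] := With[{w = {low[x], high[x]}},
    (w . {2 x + 1, 0.5 x + 4})/Total[w]]

  tskPredict[1.0]   (* dominated by the "low" rule *)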
All of these methods are highly parameterized. The results can be easily
visualized using the Mathematica front end and modified with the
Mathematica language to fine-tune the models.
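For example, the centres returned by the fuzzy c-means sketch above can be overlaid on the data with standard front-end graphics (ListPlot and PointSize are ordinary Mathematica functions; the data here is random).

  data = RandomReal[1, {200, 2}];
  {centers, memberships} = fuzzyCMeans[data, 3];
  (* Grey points are the data; large red points mark the cluster centres. *)
  ListPlot[{data, centers},
    PlotStyle -> {Gray, Directive[Red, PointSize[Large]]}]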