nimble.train¶
- nimble.train(learnerName, trainX, trainY=None, arguments=None, multiClassStrategy=None, randomSeed=None, tuning=None, *, useLog=None, **kwarguments)¶
Train a specified learner using the provided data.
- Parameters:
learnerName (str) – The learner to be called. This can be a string in the form 'package.learner' or the learner class object.
trainX (nimble Base object) – Data to be used for training.
trainY (identifier, nimble Base object) – A name or index of the feature in trainX containing the labels, or another nimble Base object containing the labels that correspond to trainX.
arguments (dict) – Mapping argument names (strings) to their values, to be used during training and application (e.g., {'dimensions': 5, 'k': 5}). Multiple values for arguments can be provided by using a Tune object (e.g., {'k': Tune([3, 5, 7])}) to initiate hyperparameter tuning and return the learner trained on the best set of arguments. To provide an argument that is an object from the same package as the learner, use a nimble.Init object with the object name and its instantiation arguments (e.g., {'optimizer': nimble.Init('SGD', learning_rate=0.01)}). Note: learner arguments can also be passed as kwarguments, so this dictionary will be merged with any keyword arguments.
multiClassStrategy (str, None) – May be 'OneVsAll' or 'OneVsOne' to train the learner using that multiclass strategy. When None, the learner is trained on the data as provided.
randomSeed (int) – Set a random seed for the operation. When None, the randomness is controlled by Nimble’s random seed. Ignored if learner does not depend on randomness.
tuning (nimble.Tuning, performanceFunction, None) – Required when hyperparameter tuning is initiated by Tune objects in the arguments. A Tuning instance details how the argument sets will be selected and validated. For convenience, a performanceFunction may instead be provided; this will trigger construction of a Tuning instance using the default consecutive selection method with 5-fold cross-validation.
useLog (bool, None) – Local control for whether to send results/timing to the logger. If None (default), use the value as specified in the "logger" "enabledByDefault" configuration option. If True, send to the logger regardless of the global option. If False, do NOT send to the logger, regardless of the global option.
kwarguments – Keyword arguments specify variables that are passed to the learner. These are combined with the arguments parameter. Multiple values for arguments can be provided by using a Tune object (e.g., k=Tune([3, 5, 7])) to initiate hyperparameter tuning and return the learner trained on the best set of arguments. To provide an argument that is an object from the same package as the learner, use a nimble.Init object with the object name and its instantiation arguments (e.g., optimizer=nimble.Init('SGD', learning_rate=0.01)).
- Returns:
TrainedLearner
See also
trainAndApply, trainAndTest, trainAndTestOnTrainingData, nimble.core.interfaces.TrainedLearner, nimble.Init, nimble.Tune, nimble.Tuning
Examples
A single dataset which contains the labels.
>>> lst = [[1, 0, 0, 1],
...        [0, 1, 0, 2],
...        [0, 0, 1, 3],
...        [1, 0, 0, 1],
...        [0, 1, 0, 2],
...        [0, 0, 1, 3]]
>>> ftNames = ['a', 'b', 'c', 'label']
>>> trainData = nimble.data(lst, featureNames=ftNames)
>>> tl = nimble.train('nimble.KNNClassifier', trainX=trainData,
...                   trainY='label')
>>> print(type(tl))
<class 'nimble.core.interfaces.universal_interface.TrainedLearner'>
Passing arguments to the learner. Both the arguments parameter and kwarguments can be utilized; they will be merged. Below, C and kernel are parameters for scikit-learn's SVC learner.

>>> lstX = [[1, 0, 0],
...         [0, 1, 0],
...         [0, 0, 1],
...         [1, 0, 0],
...         [0, 1, 0],
...         [0, 0, 1]]
>>> lstY = [[1], [2], [3], [1], [2], [3]]
>>> trainX = nimble.data(lstX)
>>> trainY = nimble.data(lstY)
>>> tl = nimble.train('sciKitLearn.SVC', trainX=trainX, trainY=trainY,
...                   arguments={'C': 0.1}, kernel='linear')
>>> tlAttributes = tl.getAttributes()
>>> cValue = tlAttributes['C']
>>> kernelValue = tlAttributes['kernel']
>>> print(cValue, kernelValue)
0.1 linear
Keywords: learn, model, regression, classification, neural network, clustering, supervised, unsupervised, deep learning, fit, training, machine learning