nimble.trainAndTest
- nimble.trainAndTest(learnerName, performanceFunction, trainX, trainY=None, testX=None, testY=None, arguments=None, multiClassStrategy=None, randomSeed=None, tuning=None, *, useLog=None, **kwarguments)
Train a model and get the results of its performance.
The performanceFunction is used to calculate a metric that compares the known data to the predicted data generated after training the model. A performance function is a specialized function that allows Nimble to extract the correct prediction data and to analyze and compare the values it returns. Common performance functions are available in nimble.calculate; see nimble.calculate.performanceFunction for support on creating a customized performance function.
- Parameters:
learnerName (str) – The learner to be called. This can be a string in the form ‘package.learner’ or the learner class object.
performanceFunction (function) – The function used to determine the performance of the learner. Pre-made functions are available in nimble.calculate. If hyperparameter tuning is triggered and the Tuning instance does not have a set performanceFunction, this function will be used for tuning as well.
trainX (nimble Base object) – Data to be used for training.
trainY (identifier, nimble Base object) –
identifier - The name or index of the feature in trainX containing the labels.
nimble Base object - Contains the labels that correspond to trainX.
testX (nimble Base object) – Data to be used for testing.
testY (identifier, nimble Base object) –
identifier - The name or index of the feature in testX containing the labels.
nimble Base object - Contains the labels that correspond to testX.
arguments (dict) – Mapping argument names (strings) to their values, to be used during training and application (e.g., {'dimensions': 5, 'k': 5}). Multiple values for an argument can be provided by using a Tune object (e.g., {'k': Tune([3, 5, 7])}) to initiate hyperparameter tuning and return the learner trained on the best set of arguments; see the tuning sketch in the Examples section below. To provide an argument that is an object from the same package as the learner, use a nimble.Init object with the object name and its instantiation arguments (e.g., {'optimizer': nimble.Init('SGD', learning_rate=0.01)}). Note: learner arguments can also be passed as kwarguments, so this dictionary will be merged with any keyword arguments.
multiClassStrategy (str, None) – May be 'OneVsAll' or 'OneVsOne' to train the learner using that multiclass strategy; an example is shown in the Examples section below. When None, the learner is trained on the data as provided.
randomSeed (int) – Set a random seed for the operation. When None, the randomness is controlled by Nimble's random seed. Ignored if the learner does not depend on randomness.
tuning (nimble.Tuning, performanceFunction, None) – Applies when hyperparameter tuning is initiated by Tune objects in the arguments. A Tuning instance details how the argument sets will be selected and validated. For convenience, a performanceFunction may be provided instead, or None will use this function's performanceFunction; either of these triggers construction of a Tuning instance using the default consecutive selection method with 5-fold cross validation.
useLog (bool, None) – Local control for whether to send results/timing to the logger. If None (default), use the value as specified in the "logger" "enabledByDefault" configuration option. If True, send to the logger regardless of the global option. If False, do NOT send to the logger, regardless of the global option.
kwarguments – Keyword arguments specifying variables that are passed to the learner. These are combined with the arguments parameter. Multiple values for an argument can be provided by using a Tune object (e.g., k=Tune([3, 5, 7])) to initiate hyperparameter tuning and return the learner trained on the best set of arguments. To provide an argument that is an object from the same package as the learner, use a nimble.Init object with the object name and its instantiation arguments (e.g., optimizer=nimble.Init('SGD', learning_rate=0.01)).
- Returns:
performance – The calculated value of the performanceFunction after the test.
See also
train, trainAndTestOnTrainingData, nimble.Init, nimble.Tune, nimble.Tuning, nimble.core.interfaces.TrainedLearner.test, nimble.calculate.performanceFunction
Examples
Train and test datasets which contain the labels.
>>> lstTrain = [[1, 0, 0, 1],
...             [0, 1, 0, 2],
...             [0, 0, 1, 3],
...             [1, 0, 0, 1],
...             [0, 1, 0, 2],
...             [0, 0, 1, 3]]
>>> lstTest = [[1, 0, 0, 1], [0, 1, 0, 2], [0, 0, 1, 3]]
>>> ftNames = ['a', 'b', 'c', 'label']
>>> trainData = nimble.data(lstTrain,
...                         featureNames=ftNames)
>>> testData = nimble.data(lstTest, featureNames=ftNames)
>>> perform = nimble.trainAndTest(
...     'nimble.KNNClassifier', nimble.calculate.fractionIncorrect,
...     trainX=trainData, trainY='label', testX=testData,
...     testY='label')
>>> perform
0.0
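For reference, the performanceFunction on its own just compares known values to predicted values. The following is a small sketch, not part of the original examples, assuming that pre-made functions such as fractionIncorrect can be called directly with the known and predicted values as nimble data objects and return a float.
>>> import nimble
>>> knownLabels = nimble.data([[1], [2], [3], [1]])
>>> predictedLabels = nimble.data([[1], [2], [3], [2]])
>>> # one of the four predictions disagrees, so the expected value is 0.25
>>> error = nimble.calculate.fractionIncorrect(knownLabels, predictedLabels)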
Passing arguments to the learner. Both the arguments parameter and kwarguments can be used; they will be merged. Below, C and kernel are parameters for scikit-learn's SVC learner.
>>> lstTrainX = [[1, 0, 0],
...              [0, 1, 0],
...              [0, 0, 1],
...              [1, 0, 0],
...              [0, 1, 0],
...              [0, 0, 1]]
>>> lstTrainY = [[1], [2], [3], [1], [2], [3]]
>>> lstTestX = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
>>> lstTestY = [[1], [2], [3]]
>>> trainX = nimble.data(lstTrainX)
>>> trainY = nimble.data(lstTrainY)
>>> testX = nimble.data(lstTestX)
>>> testY = nimble.data(lstTestY)
>>> perform = nimble.trainAndTest(
...     'sciKitLearn.SVC', nimble.calculate.fractionIncorrect,
...     trainX=trainX, trainY=trainY, testX=testX, testY=testY,
...     arguments={'C': 0.1}, kernel='linear')
>>> perform
0.0
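Hyperparameter tuning can be initiated through Tune, as described for the arguments and kwarguments parameters. This is a sketch rather than an output-verified example: it reuses the toy dataset from the first example (repeated so that the default 5-fold cross validation has enough points) and illustrative candidate values for KNNClassifier's k.
>>> lstTrain = [[1, 0, 0, 1],
...             [0, 1, 0, 2],
...             [0, 0, 1, 3],
...             [1, 0, 0, 1],
...             [0, 1, 0, 2],
...             [0, 0, 1, 3]]
>>> lstTest = [[1, 0, 0, 1], [0, 1, 0, 2], [0, 0, 1, 3]]
>>> ftNames = ['a', 'b', 'c', 'label']
>>> # repeat the rows so each cross validation fold keeps several points
>>> trainData = nimble.data(lstTrain * 2, featureNames=ftNames)
>>> testData = nimble.data(lstTest, featureNames=ftNames)
>>> # Tune initiates tuning; with tuning=None, argument sets are compared
>>> # using fractionIncorrect with the default consecutive selection and
>>> # 5-fold cross validation, and the best k is used for the final test.
>>> perform = nimble.trainAndTest(
...     'nimble.KNNClassifier', nimble.calculate.fractionIncorrect,
...     trainX=trainData, trainY='label', testX=testData,
...     testY='label', k=nimble.Tune([1, 3]))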
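The multiClassStrategy, randomSeed, and useLog parameters are passed in the same way. The brief sketch below reuses trainData and testData from the tuning sketch above; 'OneVsAll', the seed value, and useLog=False are illustrative choices, not defaults.
>>> # train one binary KNNClassifier per label value, seed the operation,
>>> # and skip logging for this call only
>>> perform = nimble.trainAndTest(
...     'nimble.KNNClassifier', nimble.calculate.fractionIncorrect,
...     trainX=trainData, trainY='label', testX=testData,
...     testY='label', multiClassStrategy='OneVsAll',
...     randomSeed=42, useLog=False)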
Keywords: performance, testing, supervised, score, model, training, machine learning, predict, error, measure, accuracy