How to use Cleanlab as a pre-processing step with time-consuming model training? #69
Replies: 2 comments 4 replies
-
OK, I found that training on clean samples is done at the end via `self.clf.fit(x_pruned, s_pruned, sample_weight=self.sample_weight)`. I think it can make sense to filter out the noisy samples with a time-efficient training routine, like:
And afterwards do the real, time-consuming training with clean labels on bigger images / more epochs / the full CNN. Does that make sense?
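To make the two-stage idea concrete, here is a rough sketch (my own approximation, not cleanlab's actual pruning code) using a cheap sklearn proxy model and `cross_val_predict` for out-of-sample probabilities; the per-class threshold rule is a simplified stand-in for confident learning:

```python
# Stage 1: cheap proxy model + cross-validation -> flag suspect labels.
# Stage 2: run the expensive training only on the surviving samples.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_predict

X, y = make_classification(n_samples=500, n_classes=3, n_informative=6,
                           random_state=0)
n_classes = 3

# Cheap proxy: logistic regression instead of the full CNN.
proxy = LogisticRegression(max_iter=1000)
pred_probs = cross_val_predict(proxy, X, y, cv=5, method="predict_proba")

# Simplified confident-learning-style rule: per-class threshold is the
# mean predicted probability of that class over examples *given* that
# class label; flag an example if it confidently belongs elsewhere.
thresholds = np.array([pred_probs[y == k, k].mean() for k in range(n_classes)])
confident = pred_probs >= thresholds            # (n, K) boolean
confident[np.arange(len(y)), y] = False         # ignore the given label
noise_mask = confident.any(axis=1)              # True = suspected noisy label

X_clean, y_clean = X[~noise_mask], y[~noise_mask]
# ...now do the slow training (big images / more epochs / full CNN)
# on (X_clean, y_clean) only.
```

The helper names and the threshold rule here are my assumptions for illustration; cleanlab's actual pruning estimates the joint distribution of given vs. true labels before filtering.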
-
I was wondering the same thing with regard to models that are time-consuming to train. In the paper (p. 12 / sec. 5.1) the authors explain their training procedure for the CIFAR-10 experiment:
Training ResNet-50 for 350 epochs has got to be fairly time-consuming even on a high-end GPU, so I am curious, @cgnorthcutt: did you repeat this entire process for each cross-validation fold? If so, how many folds were used in total? I could not find those details in the paper, only this mention a few sentences later:
Are there alternative CL strategies besides cross-validation? For instance:
In this approach, I think the estimate of
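For what it's worth, one cheaper alternative I can imagine (an assumption on my part, not something from the paper) is a single train/holdout split: you pay for one training run instead of K, but you only get out-of-sample probabilities, and hence noise estimates, for the holdout portion:

```python
# Sketch: single train/holdout split instead of K-fold cross-validation.
# Only the holdout half gets out-of-sample probabilities, so label
# issues can only be estimated for that half of the data.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=400, n_classes=2, random_state=1)
X_tr, X_ho, y_tr, y_ho = train_test_split(X, y, test_size=0.5,
                                          random_state=1, stratify=y)

model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
pred_probs = model.predict_proba(X_ho)   # out-of-sample only for holdout

# Confidence the model assigns to each holdout example's *given* label;
# low values point at candidate label issues (on the holdout half only).
self_conf = pred_probs[np.arange(len(y_ho)), y_ho]
```

The trade-off: one expensive training run, but noise estimates cover only `len(y_ho)` of the `len(y)` examples, and the thresholds are estimated from less data.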
-
Hello everyone,
I am just getting started with Cleanlab and really enjoy it so far! First of all, thank you for the awesome work.
I am currently trying to integrate
`LearningWithNoisyLabels(clf=MyPytorchModel())`
into my PyTorch training. My idea was to:
The following is now unclear to me: can I use
`lnl.noise_mask`
to drop the noisy labels?
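In case it helps: assuming `lnl.noise_mask` after `lnl.fit(X, y)` is a boolean array aligned with the training samples (as in cleanlab 1.x), dropping the flagged samples is plain boolean indexing. The mask below is fabricated for illustration; in practice it would come from the fitted object:

```python
# Hedged sketch: drop samples flagged by a boolean noise mask.
# Here the mask is hard-coded; with cleanlab 1.x it would be
# lnl.noise_mask after calling lnl.fit(X, y).
import numpy as np

X = np.random.RandomState(0).randn(6, 4)     # toy features
y = np.array([0, 1, 0, 1, 0, 1])             # toy (possibly noisy) labels
noise_mask = np.array([False, True, False, False, True, False])

X_clean = X[~noise_mask]   # keep only samples NOT flagged as noisy
y_clean = y[~noise_mask]
print(X_clean.shape, y_clean.shape)  # (4, 4) (4,)
```

You could then feed `(X_clean, y_clean)` into your own PyTorch training loop instead of letting the wrapper retrain its internal classifier.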