Neural Network Knowledge Distillation

A TensorFlow demonstration of the Knowledge Distillation framework, showing how soft labels act as regularizers and how a neural network's knowledge can be transferred to a simpler model.
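
A minimal sketch of the kind of distillation loss this demo refers to, assuming the standard temperature-scaled formulation (soft targets from the teacher, hard labels from the data); the function name and the `temperature` / `alpha` parameters are illustrative and not taken from the repository.

```python
import tensorflow as tf

def distillation_loss(teacher_logits, student_logits, labels,
                      temperature=4.0, alpha=0.5):
    # Soft targets: teacher probabilities softened by the temperature.
    soft_targets = tf.nn.softmax(teacher_logits / temperature)
    soft_preds = tf.nn.log_softmax(student_logits / temperature)
    # Cross-entropy between soft targets and the student's soft predictions,
    # scaled by T^2 to keep gradient magnitudes comparable to the hard term.
    soft_loss = -tf.reduce_mean(
        tf.reduce_sum(soft_targets * soft_preds, axis=-1)) * temperature ** 2
    # Standard cross-entropy on the hard labels.
    hard_loss = tf.reduce_mean(
        tf.nn.sparse_softmax_cross_entropy_with_logits(
            labels=labels, logits=student_logits))
    return alpha * soft_loss + (1.0 - alpha) * hard_loss
```

Here the teacher is the larger pre-trained network and the student is the simpler model; `alpha` balances the soft-label (regularizing) term against the hard-label term.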

Disjunctive Normal Networks

A PyTorch implementation, with an sklearn-like API, of a neural network that constructs a Boolean function by dividing the feature space with convex polytopes.
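
A minimal PyTorch sketch of the underlying idea, assuming the usual soft-DNF construction: each polytope is a conjunction (product of sigmoid half-space memberships) and the polytopes are combined by a disjunction via De Morgan's law. Class and parameter names are illustrative, not the repository's API.

```python
import torch
import torch.nn as nn

class DisjunctiveNormalNetwork(nn.Module):
    """Soft disjunctive normal form over half-spaces: each polytope is the AND
    of its half-spaces, and the output is the OR of all polytopes."""

    def __init__(self, in_features, n_polytopes, n_halfspaces):
        super().__init__()
        # One linear layer holds every half-space of every polytope.
        self.halfspaces = nn.Linear(in_features, n_polytopes * n_halfspaces)
        self.n_polytopes = n_polytopes
        self.n_halfspaces = n_halfspaces

    def forward(self, x):
        # Soft membership in each half-space: (batch, polytopes, half-spaces).
        h = torch.sigmoid(self.halfspaces(x))
        h = h.view(-1, self.n_polytopes, self.n_halfspaces)
        # AND within a polytope: product of its half-space memberships.
        polytopes = h.prod(dim=2)
        # OR across polytopes via De Morgan: 1 - prod(1 - p).
        return 1.0 - (1.0 - polytopes).prod(dim=1)
```

Since the output is already a probability in [0, 1], such a model could be trained with binary cross-entropy, e.g. `DisjunctiveNormalNetwork(2, n_polytopes=4, n_halfspaces=6)` for a 2-D toy classification problem.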
