libauc.models

This module provides several popular deep neural network implementations collected and adapted from public codebases. We recommend that users cite the original papers when using these models. Here is an overview of this module:

Model                          Reference
DenseNet                       huang2017densely
ResNet                         he2016deep
ResNet (CIFAR version)         he2016deep
NeuMF                          he2017neural
MLP (Multilayer Perceptron)    yuan2023libauc (our implementation)

Please refer to the source code for more details about each implementation.

libauc.models.densenet

class DenseNet(growth_rate=32, block_config=(6, 12, 24, 16), num_init_features=64, bn_size=4, drop_rate=0.0, num_classes=1, memory_efficient=False, last_activation=None)[source]

Densenet-BC model class, based on “Densely Connected Convolutional Networks”

Parameters:
  • growth_rate (int) – how many filters to add each layer (k in paper)

  • block_config (list of 4 ints) – how many layers in each pooling block

  • num_init_features (int) – the number of filters to learn in the first convolution layer

  • bn_size (int) – multiplicative factor for the number of bottleneck features (i.e., bn_size * k features in the bottleneck layer)

  • drop_rate (float) – dropout rate after each dense layer

  • num_classes (int) – number of classification classes

  • memory_efficient (bool) – If True, uses checkpointing: much more memory-efficient, but slower. Default: False. See “Memory-Efficient Implementation of DenseNets”

densenet121(pretrained=False, progress=True, activations='relu', **kwargs)

Densenet-121 model from “Densely Connected Convolutional Networks”

Parameters:
  • pretrained (bool) – If True, returns a model pre-trained on ImageNet

  • progress (bool) – If True, displays a progress bar of the download to stderr

  • memory_efficient (bool) – If True, uses checkpointing: much more memory-efficient, but slower. Default: False. See “Memory-Efficient Implementation of DenseNets”

densenet161(pretrained=False, progress=True, activations='relu', **kwargs)

Densenet-161 model from “Densely Connected Convolutional Networks”

Parameters:
  • pretrained (bool) – If True, returns a model pre-trained on ImageNet

  • progress (bool) – If True, displays a progress bar of the download to stderr

  • memory_efficient (bool) – If True, uses checkpointing: much more memory-efficient, but slower. Default: False. See “Memory-Efficient Implementation of DenseNets”

densenet169(pretrained=False, progress=True, activations='relu', **kwargs)

Densenet-169 model from “Densely Connected Convolutional Networks”

Parameters:
  • pretrained (bool) – If True, returns a model pre-trained on ImageNet

  • progress (bool) – If True, displays a progress bar of the download to stderr

  • memory_efficient (bool) – If True, uses checkpointing: much more memory-efficient, but slower. Default: False. See “Memory-Efficient Implementation of DenseNets”

densenet201(pretrained=False, progress=True, activations='relu', **kwargs)

Densenet-201 model from “Densely Connected Convolutional Networks”

Parameters:
  • pretrained (bool) – If True, returns a model pre-trained on ImageNet

  • progress (bool) – If True, displays a progress bar of the download to stderr

  • memory_efficient (bool) – If True, uses checkpointing: much more memory-efficient, but slower. Default: False. See “Memory-Efficient Implementation of DenseNets”
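The four constructors above differ only in their (growth_rate, num_init_features, block_config) triples. As a quick sanity check on the block_config semantics, the sketch below computes how the feature-map width grows through the network; the per-variant configurations are the standard values from the DenseNet paper, and whether this library uses exactly these values should be confirmed against the source.

```python
# Sketch: DenseNet feature-width arithmetic. Each dense block with L layers
# adds L * growth_rate channels; each transition layer between blocks then
# halves the channel count. No transition follows the last block.

def densenet_feature_widths(growth_rate, num_init_features, block_config):
    """Return the channel count after each dense block."""
    widths = []
    num_features = num_init_features
    for i, num_layers in enumerate(block_config):
        num_features += num_layers * growth_rate
        widths.append(num_features)
        if i != len(block_config) - 1:  # transition halves the channels
            num_features //= 2
    return widths

# Standard configurations from "Densely Connected Convolutional Networks".
configs = {
    "densenet121": (32, 64, (6, 12, 24, 16)),
    "densenet161": (48, 96, (6, 12, 36, 24)),
    "densenet169": (32, 64, (6, 12, 32, 32)),
    "densenet201": (32, 64, (6, 12, 48, 32)),
}

for name, cfg in configs.items():
    # final width == input dimension of the classifier layer
    print(name, densenet_feature_widths(*cfg)[-1])
```

For example, DenseNet-121 ends with 1024 channels feeding the classifier, which is why its final linear layer has 1024 inputs.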

libauc.models.neumf

class NeuMF(user_num: int, item_num: int, dropout: float = 0.2, emb_size: int = 64, layers: str = '[64]')[source]

NeuMF is a widely used model for recommender systems.

Parameters:
  • user_num (int) – the number of users in the dataset

  • item_num (int) – the number of items in the dataset

  • dropout (float, optional) – dropout ratio for the model

  • emb_size (int, optional) – embedding size of the model

  • layers (str, optional) – a string describing the hidden-layer sizes of the model’s MLP component, e.g. '[64]'

Reference: he2017neural
static init_weights(m)[source]
load_model(model_path=None)[source]
reset_last_layer()[source]
save_model(model_path=None)[source]
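Note that the layers argument is passed as a string literal such as '[64]' rather than a Python list. One plausible way such a spec can be turned into a list of layer sizes — a hypothetical helper for illustration, not necessarily how NeuMF parses it internally — is ast.literal_eval, which evaluates literals without executing arbitrary code:

```python
import ast

def parse_layers(layers_spec: str) -> list:
    """Hypothetical helper: turn a spec string like '[64, 32]' into a
    list of hidden-layer sizes, rejecting anything but integer literals."""
    sizes = ast.literal_eval(layers_spec)
    if not isinstance(sizes, list) or not all(isinstance(s, int) for s in sizes):
        raise ValueError(f"bad layer spec: {layers_spec!r}")
    return sizes

print(parse_layers("[64]"))           # the default spec from the signature
print(parse_layers("[128, 64, 32]"))  # a deeper MLP tower
```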

libauc.models.perceptron

class MLP(input_dim=29, hidden_sizes=(16,), activation='relu', num_classes=1)[source]

An implementation of a Multilayer Perceptron (MLP).
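With the defaults above (29 inputs, one hidden layer of 16 units, a single output), the parameter count of a plain fully connected network with bias terms works out as follows. This assumes only the dense layers implied by the signature; any batch normalization or similar extras in the actual class would add parameters.

```python
def mlp_param_count(input_dim=29, hidden_sizes=(16,), num_classes=1):
    """Count weights + biases of a plain fully connected MLP
    matching the MLP signature above (an assumption; dense layers
    with biases only)."""
    total = 0
    prev = input_dim
    for width in (*hidden_sizes, num_classes):
        total += prev * width + width  # weight matrix + bias vector
        prev = width
    return total

print(mlp_param_count())  # 29*16 + 16 + 16*1 + 1 = 497
```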

libauc.models.resnet

class ResNet(block, layers, num_classes=1, zero_init_residual=False, groups=1, width_per_group=64, replace_stride_with_dilation=None, norm_layer=None, last_activation=None)[source]
resnet101(pretrained=False, progress=True, activations='relu', **kwargs)[source]

ResNet-101 model from “Deep Residual Learning for Image Recognition”

Parameters:
  • pretrained (bool) – If True, returns a model pre-trained on ImageNet

  • progress (bool) – If True, displays a progress bar of the download to stderr

resnet152(pretrained=False, progress=True, activations='relu', **kwargs)[source]

ResNet-152 model from “Deep Residual Learning for Image Recognition”

Parameters:
  • pretrained (bool) – If True, returns a model pre-trained on ImageNet

  • progress (bool) – If True, displays a progress bar of the download to stderr

resnet18(pretrained=False, progress=True, activations='relu', **kwargs)[source]

ResNet-18 model from “Deep Residual Learning for Image Recognition”

Parameters:
  • pretrained (bool) – If True, returns a model pre-trained on ImageNet

  • progress (bool) – If True, displays a progress bar of the download to stderr

resnet34(pretrained=False, progress=True, activations='relu', **kwargs)[source]

ResNet-34 model from “Deep Residual Learning for Image Recognition”

Parameters:
  • pretrained (bool) – If True, returns a model pre-trained on ImageNet

  • progress (bool) – If True, displays a progress bar of the download to stderr

resnet50(pretrained=False, progress=True, activations='relu', **kwargs)[source]

ResNet-50 model from “Deep Residual Learning for Image Recognition”

Parameters:
  • pretrained (bool) – If True, returns a model pre-trained on ImageNet

  • progress (bool) – If True, displays a progress bar of the download to stderr
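The numbers in these constructor names follow directly from the per-stage block counts in the paper: each block contributes two convolutions (BasicBlock, used by ResNet-18/34) or three (Bottleneck, used by ResNet-50/101/152), plus the stem convolution and the final fully connected layer. A quick check of that arithmetic:

```python
def resnet_depth(layers, convs_per_block):
    """Depth of a ResNet: stem conv + convs in all blocks + final fc layer."""
    return sum(layers) * convs_per_block + 2

# Stage configurations from "Deep Residual Learning for Image Recognition".
print(resnet_depth([2, 2, 2, 2], 2))   # resnet18  (BasicBlock)
print(resnet_depth([3, 4, 6, 3], 2))   # resnet34  (BasicBlock)
print(resnet_depth([3, 4, 6, 3], 3))   # resnet50  (Bottleneck)
print(resnet_depth([3, 4, 23, 3], 3))  # resnet101 (Bottleneck)
print(resnet_depth([3, 8, 36, 3], 3))  # resnet152 (Bottleneck)
```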

resnext101_32x8d(pretrained=False, progress=True, activations='relu', **kwargs)[source]

ResNeXt-101 32x8d model from “Aggregated Residual Transformations for Deep Neural Networks”

Parameters:
  • pretrained (bool) – If True, returns a model pre-trained on ImageNet

  • progress (bool) – If True, displays a progress bar of the download to stderr

resnext50_32x4d(pretrained=False, progress=True, activations='relu', **kwargs)[source]

ResNeXt-50 32x4d model from “Aggregated Residual Transformations for Deep Neural Networks”

Parameters:
  • pretrained (bool) – If True, returns a model pre-trained on ImageNet

  • progress (bool) – If True, displays a progress bar of the download to stderr

wide_resnet101_2(pretrained=False, progress=True, activations='relu', **kwargs)[source]

Wide ResNet-101-2 model from “Wide Residual Networks”

The model is the same as ResNet except that the number of channels in the bottleneck is twice as large in every block. The number of channels in the outer 1x1 convolutions stays the same; e.g., the last block in ResNet-50 has 2048-512-2048 channels, while the last block in Wide ResNet-50-2 has 2048-1024-2048.

Parameters:
  • pretrained (bool) – If True, returns a model pre-trained on ImageNet

  • progress (bool) – If True, displays a progress bar of the download to stderr

wide_resnet50_2(pretrained=False, progress=True, activations='relu', **kwargs)[source]

Wide ResNet-50-2 model from “Wide Residual Networks”

The model is the same as ResNet except that the number of channels in the bottleneck is twice as large in every block. The number of channels in the outer 1x1 convolutions stays the same; e.g., the last block in ResNet-50 has 2048-512-2048 channels, while the last block in Wide ResNet-50-2 has 2048-1024-2048.

Parameters:
  • pretrained (bool) – If True, returns a model pre-trained on ImageNet

  • progress (bool) – If True, displays a progress bar of the download to stderr
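Both the ResNeXt and Wide variants are driven by the groups and width_per_group arguments of the ResNet class above. In torchvision's formulation — which this implementation appears to follow, though that is an assumption worth checking against the source — a Bottleneck block on planes output channels has inner width int(planes * width_per_group / 64) * groups:

```python
def bottleneck_width(planes, width_per_group=64, groups=1):
    """Inner channel count of a Bottleneck block (torchvision-style rule;
    assumed to match this library's implementation)."""
    return int(planes * (width_per_group / 64.0)) * groups

# Last stage of a 50-layer network (planes=512):
print(bottleneck_width(512))                                # plain resnet50
print(bottleneck_width(512, width_per_group=128))           # wide_resnet50_2
print(bottleneck_width(512, width_per_group=4, groups=32))  # resnext50_32x4d
print(bottleneck_width(512, width_per_group=8, groups=32))  # resnext101_32x8d
```

This is exactly the 512 vs. 1024 contrast described in the paragraph above, and it also explains the "32x4d" / "32x8d" suffixes: 32 groups of width 4 (or 8) per 64 base channels.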

libauc.models.resnet_cifar

class ResNet(block, num_blocks, num_classes=1, last_activation='sigmoid', pretrained=False)[source]
resnet110(pretrained=False, activations='relu', last_activation=None, **kwargs)[source]
resnet1202(pretrained=False, activations='relu', last_activation=None, **kwargs)[source]
resnet20(pretrained=False, activations='relu', last_activation=None, **kwargs)[source]
resnet32(pretrained=False, activations='relu', last_activation=None, **kwargs)[source]
resnet44(pretrained=False, activations='relu', last_activation=None, **kwargs)[source]
resnet56(pretrained=False, activations='relu', last_activation=None, **kwargs)[source]
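These CIFAR-style ResNets follow the 6n+2 depth rule from the original paper: three stages of n BasicBlocks each (two 3x3 convolutions per block), plus the stem convolution and the final linear layer. The num_blocks values below are inferred from the constructor names via that rule; the actual values each constructor passes should be confirmed in the source.

```python
def cifar_resnet_depth(num_blocks):
    """Depth of a CIFAR ResNet: 2 convs per block + stem conv + linear head."""
    return 2 * sum(num_blocks) + 2

# Inferred stage configurations (n blocks per stage, depth = 6n + 2):
print(cifar_resnet_depth([3, 3, 3]))        # resnet20
print(cifar_resnet_depth([5, 5, 5]))        # resnet32
print(cifar_resnet_depth([7, 7, 7]))        # resnet44
print(cifar_resnet_depth([9, 9, 9]))        # resnet56
print(cifar_resnet_depth([18, 18, 18]))     # resnet110
print(cifar_resnet_depth([200, 200, 200]))  # resnet1202
```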