Some differences I've noted from the Torch WRN implementation #3

@RaananHadar

Description

Hi,
2 things:

  1. Using the 26-layer code, I've reached 93% accuracy on CIFAR-10, which is the same as you currently report for the 40-layer version.
  2. Re: the dataset used in the Torch implementation: please note that it uses a dataset that was already prewhitened, applying global contrast normalization with a scale value of 55.
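For reference, per-image global contrast normalization with a scale of 55 can be sketched like this (the scale value comes from the point above; the exact epsilon and the use of the RMS as the norm are assumptions about the Torch preprocessing, not confirmed details):

```python
import numpy as np

def global_contrast_normalize(x, scale=55.0, eps=1e-8):
    """Per-image GCN sketch: center the image, then rescale so its
    RMS contrast equals `scale`. The eps guard against near-constant
    images is an assumption, not taken from the Torch code."""
    x = x.astype(np.float64)
    x = x - x.mean()                 # remove the per-image mean
    norm = np.sqrt((x ** 2).mean())  # RMS contrast of the image
    return scale * x / max(norm, eps)
```

After this transform each image has zero mean and an RMS value of 55, which is why raw-pixel statistics from an unnormalized pipeline will not match the Torch dataset.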

As a final note, I would suggest normalizing the data by 255 so it ranges from 0 to 1. Please note that I have tried training on such a normalized set; it did not improve performance significantly. The main difference from the Torch implementation seems to be in the augmentation.
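For comparison, the standard CIFAR-10 augmentation used by many Torch ResNet-style training scripts is 4-pixel zero padding, a random 32x32 crop, and a random horizontal flip. This is an assumption about what the Torch implementation does, sketched here so the two pipelines can be compared directly:

```python
import numpy as np

def augment(img, pad=4, rng=np.random):
    """Sketch of the commonly used CIFAR-10 augmentation:
    zero-pad by `pad` pixels on each side, take a random crop at
    the original size, then flip horizontally with probability 0.5."""
    h, w, c = img.shape
    padded = np.zeros((h + 2 * pad, w + 2 * pad, c), dtype=img.dtype)
    padded[pad:pad + h, pad:pad + w] = img
    top = rng.randint(0, 2 * pad + 1)    # random crop offsets
    left = rng.randint(0, 2 * pad + 1)
    crop = padded[top:top + h, left:left + w]
    if rng.rand() < 0.5:                 # random horizontal flip
        crop = crop[:, ::-1]
    return crop
```

If the repo here trains on unaugmented (or differently augmented) images, that alone could plausibly account for the accuracy gap noted above.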
