Publishing House SB RAS:

Address of the Publishing House SB RAS:
Morskoy pr. 2, 630090 Novosibirsk, Russia



Avtometriya

2020, No. 1

APPLICABILITY OF MINIFLOATS FOR EFFECTIVE CALCULATIONS IN NEURAL NETWORKS

A. Yu. Kondrat'ev1, A. I. Goncharenko2,1
1Ekspasoft company, Novosibirsk, Russia
2Novosibirsk State University, Novosibirsk, Russia
Keywords: neural networks, deep learning, data types, minifloat, specialized computing devices

Abstract

The possibility of running neural networks on minifloats has been studied. Calculations were performed using a float16 accumulator for intermediate computations. Performance was tested on the GoogleNet, ResNet-50, and MobileNet-v2 convolutional neural networks and the DeepSpeech-v01 recurrent network. Experiments have shown that, without additional training, the accuracy of these neural networks with 11-bit minifloats is not inferior to that of networks using the standard float32 type. The results indicate that minifloats can be used to design efficient computing devices for neural network inference.
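The abstract does not specify how the 11 bits are split between exponent and mantissa, so the sketch below assumes a hypothetical 1 sign + 5 exponent + 5 mantissa layout. It shows one common way to emulate minifloat inference on ordinary hardware: round each float32 value to the nearest value representable in the narrow format (overflow and subnormal handling are simplified).

```python
import math

def quantize_minifloat(x, exp_bits=5, man_bits=5):
    """Round x to the nearest value representable in a hypothetical
    1 + exp_bits + man_bits minifloat. The bit split is an assumption;
    the paper only states the total width of 11 bits."""
    if x == 0.0 or math.isnan(x) or math.isinf(x):
        return x
    bias = 2 ** (exp_bits - 1) - 1
    # Unbiased exponent of x, clamped to the normal range
    # (subnormals are approximated by clamping to the minimum exponent).
    e = math.floor(math.log2(abs(x)))
    e = max(min(e, bias), 1 - bias)
    # Spacing between representable values at this exponent.
    step = 2.0 ** (e - man_bits)
    # Round the value to the nearest multiple of the step.
    return round(x / step) * step

# Emulating minifloat inference: quantize weights/activations,
# keep accumulation in higher precision (float16/float32).
print(quantize_minifloat(0.1))   # nearest 11-bit minifloat to 0.1
print(quantize_minifloat(1.0))   # exactly representable, unchanged
```

A usage pattern consistent with the paper's setup would be to quantize weights and layer inputs with a function like this while summing products in a wider accumulator, then compare end-to-end accuracy against the float32 baseline.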