ABSTRACT DATABASE «UKRAINICA SCIENTIFIC»




Abstract database - search results


Search query: (<.>ID=REF-0000807528<.>)
Total documents found: 1

Kenshimov C. 
A comparison of convolutional neural networks for Kazakh sign language recognition / C. Kenshimov, S. Mukhanov, T. Merembayev, D. Yedilkhan // Eastern-Europ. J. of Enterprise Technologies. - 2021. - № 5/2. - P. 44-54. - Bibliogr.: 16 refs. - Eng.

For people with disabilities, sign language is the most important means of communication, and a growing number of researchers around the world are therefore proposing intelligent hand gesture recognition systems. Such a system is aimed not only at those who wish to understand sign language but also at those who communicate through gesture recognition software. In this paper, a new benchmark dataset for Kazakh fingerspelling, suitable for training deep neural networks, is introduced. The dataset contains more than 10,122 gesture samples covering 42 letters of the alphabet. The alphabet has its own peculiarities, as some characters are shown in motion, which may influence sign recognition. The paper describes the research, analysis, comparison and testing of the convolutional neural networks LeNet, AlexNet, ResNet and EfficientNet (EfficientNetB7). The EfficientNet architecture is state-of-the-art (SOTA) and is the newest of the architectures under consideration. On this dataset, we show that the LeNet and EfficientNet networks outperform the other competing algorithms. Moreover, EfficientNet can achieve state-of-the-art performance on other hand gesture datasets. The architecture and operating principles of these algorithms reflect the effectiveness of their application to sign language recognition. The CNN models are evaluated using accuracy and a penalty matrix. During training, LeNet and EfficientNet showed better results: their accuracy and loss curves followed similar, close trends. The results of EfficientNet were explained with the tools of the SHapley Additive exPlanations (SHAP) framework, which probed the model to detect complex relationships between features in the images. Focusing on the SHAP tool may help to further improve the accuracy of the model.
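
To make the evaluation pipeline sketched in the abstract more concrete, the following is a minimal, hypothetical illustration rather than the authors' code: it trains a small LeNet-style CNN and an EfficientNetB7 built with tf.keras on a random stand-in for the fingerspelling benchmark, then reports accuracy and a confusion matrix. The input resolution, training settings, placeholder data and the SHAP call at the end are all assumptions; the actual study uses the real 42-letter dataset and longer training.

```python
import numpy as np
import tensorflow as tf
from sklearn.metrics import accuracy_score, confusion_matrix

NUM_CLASSES = 42          # fingerspelled letters in the benchmark
IMG_SHAPE = (64, 64, 3)   # assumed input resolution for this sketch

# Hypothetical stand-in data; the real benchmark has ~10,122 labelled images.
rng = np.random.default_rng(0)
x_train = rng.random((256, *IMG_SHAPE), dtype=np.float32)
y_train = rng.integers(0, NUM_CLASSES, size=256)
x_test = rng.random((64, *IMG_SHAPE), dtype=np.float32)
y_test = rng.integers(0, NUM_CLASSES, size=64)

def lenet(input_shape, n_classes):
    """Small LeNet-style CNN: two conv/pool stages followed by dense layers."""
    return tf.keras.Sequential([
        tf.keras.layers.Input(shape=input_shape),
        tf.keras.layers.Conv2D(6, 5, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(16, 5, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(120, activation="relu"),
        tf.keras.layers.Dense(n_classes, activation="softmax"),
    ])

def efficientnet_b7(input_shape, n_classes):
    """EfficientNetB7 built from scratch (weights=None), purely for illustration."""
    return tf.keras.applications.EfficientNetB7(
        weights=None, input_shape=input_shape, classes=n_classes)

models = {
    "LeNet": lenet(IMG_SHAPE, NUM_CLASSES),
    "EfficientNetB7": efficientnet_b7(IMG_SHAPE, NUM_CLASSES),
}

for name, model in models.items():
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    model.fit(x_train, y_train, epochs=1, batch_size=32, verbose=0)
    preds = model.predict(x_test, verbose=0).argmax(axis=1)
    print(name, "accuracy:", accuracy_score(y_test, preds))
    print(confusion_matrix(y_test, preds))   # per-class error structure

# Pixel-level attributions in the spirit of the SHAP analysis mentioned above
# (assumes the `shap` package and a compatible TensorFlow version are installed).
import shap
explainer = shap.GradientExplainer(models["LeNet"], x_train[:32])
shap_values = explainer.shap_values(x_test[:2])   # one attribution array per class
```

In practice the placeholder arrays would be replaced by the benchmark images, and each model would be trained for many epochs before the accuracy, confusion matrix and SHAP attributions are compared.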


NBUV rubricator index: З970.664.4

NBUV shelfmark: Ж24320
Full text available in: Scientific Periodicals of Ukraine
V. I. Vernadsky National Library of Ukraine
Department of Scientific Formation of National Abstract Resources
Institute for Information Recording of the NAS of Ukraine
