Recurrent neural networks
-
The Master's thesis in Computer Science "Nghiên cứu bài toán bóc tách thông tin trong chứng minh thư sử dụng học sâu" (Research on information extraction from identity cards using deep learning) presents the following main topics: an overview of text detection and character recognition and of the development of machine learning and deep learning; an introduction to Convolutional Neural Networks; the PixelLink network model for text detection; an introduction to the Convolutional Recurrent Neural Network; and the experimental setup and results.
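As a rough illustration of the Convolutional Recurrent Neural Network idea listed above (a convolutional feature extractor feeding a bidirectional LSTM that emits per-timestep character scores), here is a minimal PyTorch sketch; the layer sizes, class count, and input resolution are illustrative assumptions, not the thesis's configuration.

```python
# Minimal CRNN sketch in PyTorch (illustrative; sizes and names are assumptions,
# not the configuration used in the thesis).
import torch
import torch.nn as nn

class CRNN(nn.Module):
    def __init__(self, num_classes=100, img_height=32):
        super().__init__()
        # Convolutional feature extractor: grayscale text-line image -> feature maps.
        self.cnn = nn.Sequential(
            nn.Conv2d(1, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2, 2),
            nn.Conv2d(64, 128, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2, 2),
        )
        feat_h = img_height // 4                     # height after two 2x2 poolings
        # Bidirectional LSTM over the width (time) dimension of the feature maps.
        self.rnn = nn.LSTM(128 * feat_h, 256, bidirectional=True, batch_first=True)
        self.fc = nn.Linear(2 * 256, num_classes)    # per-timestep class scores (e.g. for CTC)

    def forward(self, x):                            # x: (batch, 1, H, W)
        f = self.cnn(x)                              # (batch, C, H', W')
        b, c, h, w = f.shape
        f = f.permute(0, 3, 1, 2).reshape(b, w, c * h)   # sequence over image width
        out, _ = self.rnn(f)
        return self.fc(out)                          # (batch, W', num_classes)

logits = CRNN()(torch.randn(2, 1, 32, 128))          # e.g. two 32x128 text-line crops
print(logits.shape)                                  # torch.Size([2, 32, 100])
```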
-
The author relies on several mathematical forecasting models, such as the multiplicative model, the averaging model, and combined ARIMA and SARIMA models, for analysis and forecasting, and applies the LSTM network model, a special variant of RNNs (Recurrent Neural Networks), to perform analysis and forecasting with deep learning. Readers are invited to consult the document.
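As a hedged sketch of the LSTM forecasting step mentioned above, a one-step-ahead forecaster in PyTorch might look like the following; the window length, layer width, and synthetic seasonal series are illustrative assumptions, not the author's data or configuration.

```python
# One-step-ahead LSTM forecaster (illustrative sketch; window length, widths,
# and the synthetic seasonal series are assumptions, not the author's setup).
import torch
import torch.nn as nn

class LSTMForecaster(nn.Module):
    def __init__(self, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):              # x: (batch, window, 1)
        out, _ = self.lstm(x)
        return self.head(out[:, -1])   # predict the next value from the last state

# Toy seasonal series with trend, cut into sliding windows of length 12.
t = torch.arange(200, dtype=torch.float32)
series = torch.sin(2 * torch.pi * t / 12) + 0.01 * t
window = 12
X = torch.stack([series[i:i + window] for i in range(len(series) - window)]).unsqueeze(-1)
y = series[window:].unsqueeze(-1)

model = LSTMForecaster()
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.MSELoss()
for _ in range(200):                   # short training loop on the toy data
    opt.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    opt.step()
print(float(loss))
```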
-
With Europe’s ageing fleet of nuclear reactors operating closer to their safety limits, the monitoring of such reactors through complex models has become of great interest to maintain a high level of availability and safety.
-
Reference material on the derivation of the delta rule learning algorithm (Derivation of Delta Rules).
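For context, the derivation of the delta rule for a single unit y = phi(w·x) under the squared error E = ½(d − y)² yields the update Δw = η(d − y)·phi'(w·x)·x. A minimal sketch of that update, assuming a sigmoid activation and toy data:

```python
# Delta-rule update for a single sigmoid unit (illustrative sketch).
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def delta_rule_step(w, x, d, eta=0.1):
    """One gradient step on E = 0.5*(d - y)^2 with y = sigmoid(w.x)."""
    y = sigmoid(np.dot(w, x))
    # dE/dw = -(d - y) * sigmoid'(net) * x, with sigmoid'(net) = y*(1 - y)
    return w + eta * (d - y) * y * (1.0 - y) * x

w = np.zeros(3)
x, d = np.array([1.0, 0.5, -0.2]), 1.0
for _ in range(100):
    w = delta_rule_step(w, x, d)
print(sigmoid(np.dot(w, x)))   # output moves toward the target d = 1.0
```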
-
Convergence of Online Learning Algorithms in Neural Networks. An analysis of convergence of real-time algorithms for online learning in recurrent neural networks is presented. For convenience, the analysis is focused on the real-time recurrent learning (RTRL) algorithm for a recurrent perceptron. Using the assumption of contractivity of the activation function of a neuron and relaxing the rigid assumptions of the fixed optimal weights of the system, the analysis presented is general and is applicable to a wide range of existing algorithms.
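To make the RTRL recursion for a recurrent perceptron concrete, here is a minimal NumPy sketch for a single neuron with one external input, a bias, and output feedback; the toy teacher signal, learning rate, and tanh activation are illustrative assumptions rather than the paper's setup.

```python
# Real-time recurrent learning (RTRL) for a single recurrent neuron
# y(k) = tanh(w . u(k)), with u(k) = [x(k), 1, y(k-1)]  -- illustrative sketch.
import numpy as np

rng = np.random.default_rng(0)
eta, steps = 0.05, 500
w = np.zeros(3)                     # [input weight, bias, feedback weight]
pi = np.zeros(3)                    # sensitivities d y(k) / d w
y_prev = 0.0

for k in range(steps):
    x = rng.standard_normal()
    d = np.tanh(0.8 * x + 0.3)      # toy teacher signal
    u = np.array([x, 1.0, y_prev])
    y = np.tanh(w @ u)
    e = d - y
    # RTRL sensitivity recursion: pi(k) = phi'(net) * (u(k) + w_feedback * pi(k-1))
    pi = (1.0 - y**2) * (u + w[2] * pi)
    w += eta * e * pi               # gradient step using the recursive sensitivities
    y_prev = y

print(w)
```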
-
A Class of Normalised Algorithms for Online Training of Recurrent Neural Networks. A normalised version of the real-time recurrent learning (RTRL) algorithm is introduced. This has been achieved via local linearisation of the RTRL around the current point in the state space of the network. Such an algorithm provides an adaptive learning rate normalised by the L2 norm of the gradient vector at the output neuron. The analysis is general and also covers simpler cases of feedforward networks and linear FIR filters.
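A plausible reading of that normalisation (the exact form here is an assumption, in the spirit of NLMS-style step sizes) is to divide a nominal learning rate by a regularised squared L2 norm of the instantaneous gradient; in the RTRL sketch above this would replace the fixed step as follows.

```python
# Normalised step for the RTRL update (illustrative sketch, assumed form):
# a nominal rate eta0 is divided by (eps + ||instantaneous gradient||^2).
import numpy as np

def normalised_rtrl_step(w, e, pi, eta0=0.5, eps=1e-3):
    grad = e * pi                               # instantaneous gradient estimate
    eta_k = eta0 / (eps + grad @ grad)          # adaptive, norm-normalised learning rate
    return w + eta_k * grad

# usage inside the RTRL loop above:  w = normalised_rtrl_step(w, e, pi)
```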
-
Data-Reusing Adaptive Learning Algorithms. In this chapter, a class of data-reusing learning algorithms for recurrent neural networks is analysed. This is achieved starting from a case of feedforward neurons, through to the case of networks with feedback, trained with gradient descent learning algorithms. It is shown that the class of data-reusing algorithms outperforms the standard (a priori) algorithms for nonlinear adaptive filtering in terms of the instantaneous prediction error.
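The data-reusing idea can be sketched as re-applying the gradient step to the same input/target pair several times, recomputing the (a posteriori) error after each partial update; the single tanh neuron, learning rate, and number of reuses below are illustrative assumptions.

```python
# Data-reusing gradient step for a single tanh neuron (illustrative sketch):
# the same (x, d) pair is reused several times, with the error recomputed
# after each partial update (a posteriori error) before updating again.
import numpy as np

def data_reusing_step(w, x, d, eta=0.2, reuses=3):
    for _ in range(reuses):
        y = np.tanh(w @ x)
        e = d - y                      # a posteriori error after the previous pass
        w = w + eta * e * (1.0 - y**2) * x
    return w

w = np.zeros(3)
x, d = np.array([0.5, -1.0, 1.0]), 0.4
w = data_reusing_step(w, x, d)
print(np.tanh(w @ x))                  # closer to d than a single (a priori) step
```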
-
Stability Issues in RNN Architectures. The focus of this chapter is on stability and convergence of relaxation realised through NARMA recurrent neural networks. Unlike other commonly used approaches, which mostly exploit Lyapunov stability theory, the main mathematical tool employed in this analysis is the contraction mapping theorem (CMT), together with the fixed point iteration (FPI) technique. This enables derivation of the asymptotic stability (AS) and global asymptotic stability (GAS) criteria for neural relaxive systems.
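The contraction-mapping/fixed-point-iteration argument can be illustrated on a single neuron with output feedback: if the map y -> phi(a·y + b) has Lipschitz constant |a|·max phi' < 1, the relaxation converges to a unique fixed point. A minimal sketch, assuming a logistic nonlinearity and illustrative weights:

```python
# Fixed-point iteration y <- phi(a*y + b) for a single neuron with feedback.
# With the logistic phi (maximum slope 1/4), |a| * 1/4 < 1 makes the map a
# contraction, so the relaxation converges to a unique fixed point (GAS).
import numpy as np

def phi(v):
    return 1.0 / (1.0 + np.exp(-v))

a, b = 2.0, -0.5                      # feedback weight and external input (illustrative)
assert abs(a) * 0.25 < 1.0            # contraction condition for the logistic slope

y = 0.9                               # arbitrary initial state
for _ in range(50):
    y = phi(a * y + b)                # relaxation via fixed point iteration
print(y)                              # converges to the unique fixed point
```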