{"title":"Multi-Context Recurrent Neural Network for Time Series Applications","authors":"B. Q. Huang, Tarik Rashid, M-T. Kechadi","volume":10,"journal":"International Journal of Computer and Information Engineering","pagesStart":3086,"pagesEnd":3096,"ISSN":"1307-6892","URL":"https:\/\/publications.waset.org\/pdf\/3524","abstract":"<p>This paper presents a multi-context recurrent network for time series analysis. While simple recurrent networks (SRNs) are very popular among recurrent neural networks, they still have shortcomings in terms of learning speed and accuracy. To address these problems, we propose a multi-context recurrent network (MCRN) with three different learning algorithms. The performance of this network is evaluated on real-world applications such as handwriting recognition and energy load forecasting, and compared to that of a well-established SRN. The experimental results show that the MCRN is efficient and well suited to time series analysis and its applications.<\/p>\r\n","references":"[1] M. Boden, `A guide to recurrent neural networks and backpropagation', The DALLAS project, report from the NUTEK-supported project AIS-8, SICS, Holst: Application of Data Analysis with Learning Systems, (2001).\r\n[2] M.A. Castano, F. Casacuberta, and A. Bonet, Training Simple Recurrent Networks Through Gradient Descent Algorithm, volume 1240 of ISBN 3-540-63047-3, chapter Biological and Artificial Computation: From Neuroscience to Technology, pp. 493--500, eds. J. Mira, R. Moreno-Diaz and J. Cabestany, Springer Verlag, 1997.\r\n[3] W. Charytoniuk and M-S. Chen, `Very short-term load forecasting using neural networks', IEEE Trans. on Power Systems, 15(1), 1558--1572, (2000).\r\n[4] B.J. Chen, M.W. Chang, and C.J. Lin, `EUNITE network competition: Electricity load forecasting', technical report, EUNITE 2001 symposium, a forecasting competition, (2001).\r\n[5] Y. Cheng, T.W. Karjala, and D.M. Himmelblau, `Closed loop nonlinear process identification using internal recurrent nets', Neural Networks, 10(3), pp. 573--586, (1997).\r\n[6] A. Corradini and P. Cohen, `Multimodal speech-gesture interface for hands-free painting on virtual paper using partial recurrent neural networks for gesture recognition', in Proc. of the Int'l Joint Conf. on Neural Networks (IJCNN'02), volume 3, pp. 2293--2298, (2002).\r\n[7] B. de Vries and J.C. Principe, `A theory for neural networks with time delays', in NIPS-3: Proceedings of the 1990 Conference on Advances in Neural Information Processing Systems 3, pp. 162--168, San Francisco, CA, USA, (1990). Morgan Kaufmann Publishers Inc.\r\n[8] G. Dorffner, `Neural networks for time series processing', Neural Network World, 6(4), pp. 447--468, (1996).\r\n[9] W. Duch and N. Jankowski, `Transfer functions: Hidden possibilities for better neural networks', in ESANN'2001 Proceedings, European Symposium on Artificial Neural Networks, ISBN 2-930307-01-3, pp. 25--27, Belgium, (2001). D-Facto.\r\n[10] J.L. Elman, `Finding structure in time', Cognitive Science, 14(2), pp. 179--211, (1990).\r\n[11] D. Esp, `Adaptive logic networks for East Slovakian electrical load forecasting', technical report, EUNITE 2001 symposium, a forecasting competition, (2001).\r\n[12] E.W. Saad, D.V. Prokhorov, and D.C. Wunsch, `Comparative study of stock trend prediction using time delay, recurrent and probabilistic neural networks', IEEE Transactions on Neural Networks, 9(6), pp. 1456--1470, (1998).\r\n[13] L. Fausett, Fundamentals of Neural Networks: Architectures, Algorithms, and Applications, Englewood Cliffs, NJ: Prentice Hall, 1994.\r\n[14] G. Gross and F.D. Galiana, `Short-term load forecasting', Proceedings of the IEEE, 75(12), 1558--1572, (1987).\r\n[15] I. Guyon, L. Schomaker, R. Plamondon, M. Liberman, and S. Janet, `UNIPEN project of on-line data exchange and recognizer benchmarks', in Proceedings of the 12th International Conference on Pattern Recognition, ICPR'94, pp. 29--33, Jerusalem, Israel, (October 1994).\r\n[16] S. Haykin, Neural Networks: A Comprehensive Foundation, Macmillan Publishing Company, New York, 1994.\r\n[17] A. Herve and E. Betty, `Neural networks, quantitative applications', in Quantitative Applications in the Social Sciences, volume 124, London: Sage Publications, (1999).\r\n[18] B.Q. Huang and M-T. Kechadi, `A recurrent neural network recogniser for online recognition of handwritten symbols', in ICEIS (2), pp. 27--34, (2005).\r\n[19] B.Q. Huang, T. Rashid, and T. Kechadi, `A new modified network based on the Elman network', in Proceedings of the IASTED International Conference on Artificial Intelligence and Applications, ed. M.H. Hamza, volume 1 of ISBN 088986-404-7, pp. 379--384, Innsbruck, Austria, (2004). ACTA Press.\r\n[20] M.I. Jordan, `Attractor dynamics and parallelism in a connectionist sequential machine', in Proceedings of the 8th Annual Conference of the Cognitive Science Society, Englewood Cliffs, NJ: Erlbaum, pp. 531--546, (1986). Reprinted in IEEE Tutorials Series, New York: IEEE Publishing Services, 1990.\r\n[21] I. King and J. Tindle, `Storage of half hourly electric metering data and forecasting with artificial neural network technology', technical report, EUNITE 2001 symposium, a forecasting competition, (2001).\r\n[22] W. Kowalczyk, `Averaging and data enrichment: Two approaches to electricity load forecasting', technical report, EUNITE 2001 symposium, a forecasting competition, (2001).\r\n[23] S. Lawrence, C.L. Giles, and S. Fong, `Natural language grammatical inference with recurrent neural networks', IEEE Trans. on Knowledge and Data Engineering, 12(1), pp. 126--140, (2000).\r\n[24] A. Lo, H. Mamaysky, and J. Wang, `Foundations of technical analysis: Computational algorithms, statistical inference, and empirical implementation', Journal of Finance, 55, pp. 1705--1765, (2000).\r\n[25] Yee-Ling Lu, Man-Wai Mak, and Wan-Chi Siu, `Application of a fast real time recurrent learning algorithm to text-to-phone conversion', in Proceedings of the International Conference on Neural Networks, volume 5, pp. 2853--2857, (1995).\r\n[26] S. Marinai, M. Gori, and G. Soda, `Artificial neural networks for document analysis and recognition', IEEE Trans. Pattern Anal. Mach. Intell., 27(1), pp. 23--35, (2005).\r\n[27] A.D. Papalexopoulos and T.C. Hesterberg, `A regression-based approach to short-term load forecasting', IEEE Trans. on Power Systems, 5(4), pp. 1535--1547, (1990).\r\n[28] D. Park, M. El-Sharkawi, R. Marks, L. Atlas, and M. Damborg, `Electric load forecasting using artificial neural networks', IEEE Trans. on Power Systems, 6(2), 442--449, (1991).\r\n[29] D.C. Plaut, `Semantic and associative priming in a distributed attractor network', in Proceedings of the 17th Annual Conference of the Cognitive Science Society, pp. 37--42, Hillsdale, NJ: Erlbaum, (1995).\r\n[30] D. Prokhorov, `Backpropagation through time and derivative adaptive critics: A common framework for comparison', chapter in Learning and Approximate Dynamic Programming, Wiley, 2004.\r\n[31] T. Rashid, B.Q. Huang, and T. Kechadi, `A new simple recurrent network with real-time recurrent learning process', in The 14th Irish Artificial Intelligence and Cognitive Science Conference (AICS'03), ed. Padraig Cunningham, volume 1, pp. 169--174, Dublin, Ireland, (2003).\r\n[32] T. Rashid and T. Kechadi, `A practical approach for electricity load forecasting', in Proceedings of WEC'05, The Third World Enformatika Conference, ed. C. Ardal, volume 5 of ISBN 975-98458-4-9, pp. 201--205, Istanbul, Turkey, (2005). ACTA Press.\r\n[33] D.E. Rumelhart, G.E. Hinton, and R.J. Williams, `Learning internal representations by error propagation', in D.E. Rumelhart et al. (eds.), Parallel Distributed Processing: Explorations in the Microstructure of Cognition, 1: Foundations, Cambridge, MA: MIT Press, 1986.\r\n[34] M. Schenkel, I. Guyon, and D. Henderson, `On-line cursive script recognition using time delay networks and hidden Markov models', in Proc. ICASSP'94, volume 2, pp. 637--640, Adelaide, Australia, (April 1994).\r\n[35] P. Stagge and B. Sendhoff, `Organization of past states in recurrent neural networks: Implicit embedding', in Proc. of the International Conference on Computational Intelligence for Modelling, Control & Automation, pp. 21--27, Amsterdam, (1999). IOS Press.\r\n[36] J.C. Tomasz and M.Z. Jacek, `Neural network tools for stellar light prediction', in Proc. of the IEEE Aerospace Conference, volume 3, pp. 415--422, Snowmass, Colorado, USA, (February 1997).\r\n[37] P.J. Werbos, `Backpropagation through time: What it does and how to do it', in Proceedings of the IEEE, volume 78, pp. 1550--1560, (1990).\r\n[38] W.H. Wilson, `Learning performance of networks like Elman's simple recurrent networks but having multiple state vectors', Workshop of the 7th Australian Conference on Neural Networks, Australian National University, Canberra, (1996).\r\n[39] X.H. Shi, Y.C. Liang, and X. Xu, `An improved Elman model and recurrent back-propagation control neural networks', Journal of Software, 14(6), 1110--1119, (2003).","publisher":"World Academy of Science, Engineering and Technology","index":"Open Science Index 10, 2007"}