Systems Science Friday Noon Seminar Series
Format
Video: MP4; File size: 522 MB; Duration: 01:11:02
Date
3-17-2023
Abstract
The Transformer network was first introduced in 2017 in the paper "Attention Is All You Need." Transformers solve sequence-to-sequence tasks and improve on Long Short-Term Memory (LSTM) networks because they can handle long-range dependencies. Earlier architectures executed sequentially and did not use the GPU efficiently; transformers solved that problem with the multi-headed attention architecture. In this talk we will compare: 1. The architectural differences between LSTMs and Transformers. 2. The performance of LSTM vs. Transformer on a time-series forecasting task, based on the following criteria: a. Accuracy of prediction, b. Complexity of the architecture, c. Time to train.
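The sketch below is not the speaker's code; it is a minimal illustration, assuming PyTorch, of the architectural contrast the abstract describes: an LSTM forecaster that processes a window of past values step by step, versus a Transformer-encoder forecaster whose multi-headed self-attention processes all time steps in parallel. Hyperparameters and class names are illustrative assumptions.

```python
# Minimal sketch (illustrative, not the speaker's code): two one-step-ahead
# forecasters for a univariate time series, one LSTM-based and one
# Transformer-encoder-based. Both map a window of past values to a prediction.
import torch
import torch.nn as nn


class LSTMForecaster(nn.Module):
    def __init__(self, hidden_size: int = 64):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, x):                       # x: (batch, seq_len, 1)
        out, _ = self.lstm(x)                   # recurrence runs sequentially over time steps
        return self.head(out[:, -1, :])         # predict the next value from the last hidden state


class TransformerForecaster(nn.Module):
    def __init__(self, d_model: int = 64, nhead: int = 4, num_layers: int = 2):
        super().__init__()
        self.embed = nn.Linear(1, d_model)
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)
        self.head = nn.Linear(d_model, 1)

    def forward(self, x):                       # x: (batch, seq_len, 1)
        # Multi-headed self-attention lets every time step attend to every other
        # step in parallel (positional encodings omitted here for brevity).
        h = self.encoder(self.embed(x))
        return self.head(h[:, -1, :])


if __name__ == "__main__":
    window = torch.randn(8, 32, 1)              # 8 toy series, 32 past observations each
    for model in (LSTMForecaster(), TransformerForecaster()):
        print(type(model).__name__, model(window).shape)   # -> torch.Size([8, 1])
```

A comparison such as the one in the talk would train both models on the same windows and then contrast prediction accuracy, parameter count (architectural complexity), and wall-clock training time.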
Biographical Information
My name is Sandy Dash, and I have been a PhD student in the Teuscher Lab since Fall 2021. I have worked on a time-series forecasting project using deep neural networks (DNNs), which won the best poster award at the Winter School at the Indian Institute of Technology, Jodhpur, India. I am a Principal Engineer at Ampere Computing and a former Intel employee with over a decade of experience in the DRAM memory subsystem.
Subjects
Transformer Networks, Forecasting
Disciplines
Systems Science
Persistent Identifier
https://archives.pdx.edu/ds/psu/39587
Rights
© 2023 Sandy Dash
© Copyright the author(s) IN COPYRIGHT: http://rightsstatements.org/vocab/InC/1.0/ This Item is protected by copyright and/or related rights. You are free to use this Item in any way that is permitted by the copyright and related rights legislation that applies to your use. For other uses you need to obtain permission from the rights-holder(s).
DISCLAIMER: The purpose of this statement is to help the public understand how this Item may be used. When there is a (non-standard) License or contract that governs re-use of the associated Item, this statement only summarizes the effects of some of its terms. It is not a License, and should not be used to license your Work. To license your own Work, use a License offered at https://creativecommons.org/
Recommended Citation
Dash, Sandy, "Modern Neural Networks for Time-Series Forecasting (Transformers vs. LSTMs)" (2023). Systems Science Friday Noon Seminar Series. 130.
https://archives.pdx.edu/ds/psu/39587