User:Cnapun

== Reading ==

Here are some papers I have read, skimmed, or partially read:

* [https://arxiv.org/abs/1610.09460 "Building Energy Load Forecasting using Deep Neural Networks"]
** Discusses the direct application of LSTMs to load forecasting (a minimal sketch of this kind of model appears after this list)
* [http://ieeexplore.ieee.org/document/6796853/ "Training Recurrent Networks by Evolino"]
** Describes the use of genetic algorithms to train RNNs (see the second sketch after this list)
** Should offer better performance than Echo State Networks (ESNs)
** Does not describe how the crossover and mutation operations work; I referred to [http://davidmontana.net/papers/hybrid.pdf "Neural Network Weight Selection Using Genetic Algorithms"] for a more complete explanation
* Several papers on hybrid methods
** [https://openreview.net/pdf?id=ByD6xlrFe "Hybrid Neural Networks Over Time Series For Trend Forecasting"]
** [http://www.sciencedirect.com/science/article/pii/S0925231201007020 "Time series forecasting using a hybrid ARIMA and neural network model"]
** [http://ieeexplore.ieee.org/document/5433249/ "Intelligent Hybrid Wavelet Models for Short-Term Load Forecasting"]
** [http://ieeexplore.ieee.org/document/5340640/ "Short-Term Load Forecasting: Similar Day-Based Wavelet Neural Networks"]
* A couple of review papers
** [http://ieeexplore.ieee.org/document/7581373/ "Short-Term Load Forecasting Methods: A Review"]
** [https://arxiv.org/abs/1705.04378 "An overview and comparative analysis of Recurrent Neural Networks for Short Term Load Forecasting"], which gives a good, current overview of methods
* Some papers on LSTMs, training, and possible improvements
** [https://arxiv.org/abs/1409.2329 "Recurrent Neural Network Regularization"]
** [https://arxiv.org/abs/1512.05287 "A Theoretically Grounded Application of Dropout in Recurrent Neural Networks"]
** [https://arxiv.org/abs/1609.07959 "Multiplicative LSTM for sequence modelling"]
** [https://arxiv.org/abs/1610.09513 "Phased LSTM: Accelerating Recurrent Network Training for Long or Event-based Sequences"]
** [https://arxiv.org/abs/1511.01432 "Semi-supervised Sequence Learning"]
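
To make the first item concrete, here is a minimal sketch of the kind of model it describes: a single LSTM layer mapping a 24-hour window of past loads onto the next hour's load. This is my own illustration rather than code from the paper; the window length, layer size, and synthetic series are placeholder assumptions, and it assumes Keras 2 running on the TensorFlow backend. The dropout and recurrent_dropout arguments correspond to the recurrent-dropout idea from the regularization papers in the last group.

<pre>
# Minimal sketch (my own illustration, not code from the papers above): one LSTM layer
# that maps a 24-hour window of past loads to the next hour's load.
import numpy as np
from keras.models import Sequential
from keras.layers import LSTM, Dense

WINDOW = 24  # assumed window length; a real model would tune this

model = Sequential([
    LSTM(32, input_shape=(WINDOW, 1), dropout=0.2, recurrent_dropout=0.2),
    Dense(1),  # regression output: forecast of the next hour's load
])
model.compile(optimizer='adam', loss='mse')

# Synthetic stand-in for a real load series (customer data is not shown here).
series = np.sin(np.linspace(0.0, 100.0, 2000)) + 0.1 * np.random.randn(2000)
X = np.array([series[i:i + WINDOW] for i in range(len(series) - WINDOW)])[..., np.newaxis]
y = series[WINDOW:]

model.fit(X, y, epochs=2, batch_size=64, verbose=0)
print(model.predict(X[:1]))  # one-step forecast for the first window
</pre>

In a real experiment the synthetic series would be replaced by customer load data and the hyperparameters tuned.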

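For the Evolino and weight-selection papers, here is a much-simplified sketch of the idea of training by evolution: a population of weight vectors is improved by selection, crossover, and mutation instead of backpropagation. The operators shown (uniform crossover over the flattened weight vector and Gaussian mutation) and the tiny feedforward candidate network are my own simplifications for illustration, not the operators used in either paper.

<pre>
# Simplified sketch of evolving network weights with a genetic algorithm (my own
# illustration). For brevity the candidate is a tiny feedforward net rather than an
# RNN; crossover and mutation act on the flattened weight vector either way.
import numpy as np

rng = np.random.RandomState(0)

N_IN, N_HIDDEN = 4, 8
N_WEIGHTS = N_IN * N_HIDDEN + N_HIDDEN  # hidden-layer weights + output weights

def fitness(w, X, y):
    # Negative mean squared error of a one-hidden-layer tanh network.
    W1 = w[:N_IN * N_HIDDEN].reshape(N_IN, N_HIDDEN)
    W2 = w[N_IN * N_HIDDEN:].reshape(N_HIDDEN, 1)
    pred = np.tanh(X @ W1) @ W2
    return -np.mean((pred.ravel() - y) ** 2)

def crossover(a, b):
    # Uniform crossover: each weight is taken from one parent or the other.
    mask = rng.rand(a.size) < 0.5
    return np.where(mask, a, b)

def mutate(w, rate=0.05, scale=0.1):
    # Gaussian mutation applied to a small random subset of the weights.
    mask = rng.rand(w.size) < rate
    return w + mask * rng.randn(w.size) * scale

# Toy regression problem standing in for the real forecasting task.
X = rng.randn(200, N_IN)
y = np.sin(X[:, 0]) + 0.5 * X[:, 1]

pop = [rng.randn(N_WEIGHTS) for _ in range(30)]
for gen in range(50):
    pop.sort(key=lambda w: fitness(w, X, y), reverse=True)
    parents = pop[:10]  # truncation selection: keep the fittest third
    children = []
    while len(children) < len(pop) - len(parents):
        i, j = rng.choice(len(parents), 2, replace=False)
        children.append(mutate(crossover(parents[i], parents[j])))
    pop = parents + children

best = max(pop, key=lambda w: fitness(w, X, y))
print('best fitness:', fitness(best, X, y))
</pre>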

== Personal Information ==

* Case Western Reserve University, Class of 2019
* Applied Mathematics and Computer Science major

== Weekly Log ==

=== Week 0 (30 May - 2 Jun) ===

* Met Dr. Povinelli four times: to discuss possible project topics, to get oriented with the lab, to decide on a project topic, and to establish weekly milestones
* Obtained MU and MSCS account logins and an ID card
* Read most of "Learning Deep Architectures for AI"
* Read the chapter on Sequence Modeling (Ch. 10) of the Deep Learning book
* Read various other papers

=== Week 1 (5 Jun - 9 Jun) ===

* Attended GasDay camp and learned about what GasDay does
* Attended responsible conduct of research training
* Continued to read papers
* Decided to use TensorFlow and Keras for now
* Got Anaconda, TensorFlow, and Keras installed on a couple of lab computers (a quick sanity check is sketched below)
* Learned how to access customer data
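
For reference, a quick sanity check that an environment sees both packages and that Keras is using the TensorFlow backend (a sketch of the kind of check, not a transcript of an actual session):

<pre>
# Quick sanity check of the installs.
import tensorflow as tf
import keras  # prints "Using TensorFlow backend." if the backend is configured correctly

print('TensorFlow:', tf.__version__)
print('Keras:', keras.__version__)
print('Backend:', keras.backend.backend())  # expected: 'tensorflow'
</pre>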
