AI News

Sequence to Sequence (seq2seq) Recurrent Neural Network (RNN) for Time Series Prediction

The goal of this project is to let users try and experiment with the seq2seq neural network architecture.

Seq2seq architectures are normally used for more sophisticated purposes than signal prediction, such as language modeling, but this project is an interesting tutorial before moving on to more complicated things.
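As a quick orientation, here is a minimal encoder-decoder sketch in Python/Keras showing the general shape of a seq2seq model for signal prediction. The window sizes, layer widths and toy sine data are assumptions for illustration only; this is not the repository's actual TensorFlow implementation.

```python
# Minimal seq2seq (encoder-decoder) sketch for signal prediction.
# All sizes below are illustrative assumptions, not the project's settings.
import numpy as np
from tensorflow.keras import layers, models

past_steps, future_steps, n_features = 10, 5, 1  # hypothetical window sizes

# Encoder: read the past window and compress it into a state vector.
encoder_in = layers.Input(shape=(past_steps, n_features))
state = layers.LSTM(32)(encoder_in)

# Decoder: repeat the state once per future step and emit one value per step.
decoded = layers.RepeatVector(future_steps)(state)
decoded = layers.LSTM(32, return_sequences=True)(decoded)
decoder_out = layers.TimeDistributed(layers.Dense(n_features))(decoded)

model = models.Model(encoder_in, decoder_out)
model.compile(optimizer="adam", loss="mse")

# Toy training data: predict the next 5 points of a sine wave from the previous 10.
signal = np.sin(np.arange(0, 500, 0.1))
idx = range(len(signal) - past_steps - future_steps)
X = np.stack([signal[i:i + past_steps] for i in idx])[..., None]
Y = np.stack([signal[i + past_steps:i + past_steps + future_steps] for i in idx])[..., None]
model.fit(X, Y, epochs=2, batch_size=64, verbose=0)
```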

You can find the original French version of this project in the French Git branch: https://github.com/guillaume-chevalier/seq2seq-signal-prediction/tree/francais Although a '.py' Python version of this tutorial is also available in the repository, it is more convenient to run the code inside the notebook.

The notebook application (IDE) will then open in your browser as a local server; you can open the .ipynb notebook file and run code cells with CTRL+ENTER and SHIFT+ENTER, and you can also restart the kernel and run all cells at once from the menus.

Most of the time, you will only have to edit the neural network's training hyperparameters to complete an exercise, but at a certain point, changes to the architecture itself will be required.
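As an illustration, the kind of training hyperparameters you would typically be tuning looks like the dictionary below; the names and values are hypothetical and not the exercises' actual configuration.

```python
# Hypothetical training hyperparameters, for illustration only;
# the exercises define their own names and values.
hyperparameters = {
    "learning_rate": 0.005,     # optimizer step size
    "lr_decay": 0.9,            # multiplicative learning-rate decay
    "nb_iters": 1500,           # number of training iterations
    "batch_size": 64,           # sequences per gradient step
    "hidden_dim": 64,           # size of the RNN hidden state
    "layers_stacked_count": 2,  # number of stacked RNN layers
    "dropout_keep_prob": 0.8,   # keep probability if dropout is added
}
```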

A simple example would be to take as input the past values of multiple stock market symbols, whose values evolve together over time, in order to predict the future values of all of those symbols with the neural network.
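Here is a hedged sketch of how such multivariate input/output windows could be built; the symbols, window lengths and random-walk data are made up for illustration.

```python
import numpy as np

# Toy multivariate series: one column per (hypothetical) stock symbol, evolving together.
n_steps, symbols = 500, ["AAA", "BBB", "CCC"]
prices = np.cumsum(np.random.randn(n_steps, len(symbols)), axis=0)

past_steps, future_steps = 40, 10

def make_windows(series, past_steps, future_steps):
    """Slice a (time, features) array into seq2seq input/target windows."""
    X, Y = [], []
    for i in range(len(series) - past_steps - future_steps):
        X.append(series[i:i + past_steps])
        Y.append(series[i + past_steps:i + past_steps + future_steps])
    return np.array(X), np.array(Y)

X, Y = make_windows(prices, past_steps, future_steps)
print(X.shape, Y.shape)  # (450, 40, 3) and (450, 10, 3): all symbols predicted jointly
```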

Note that it would be possible to obtain better results with a smaller neural network, provided better training hyperparameters, longer training, the addition of dropout, and so on.

Here is a prediction made on actual future values; the neural network has not been trained on the future values shown here, so this is a legitimate prediction, given a model trained well enough on the task:

Disclaimer: this prediction of future values turned out really well, and you should not expect predictions to always be that good when using as little data as is used here (side note: the other prediction charts in this project are all 'average' except this one).

Other, more creative input data could be sine waves (or waves of other shapes such as sawtooth or triangle waves, or paired cos and sin signals) representing the fluctuation of minutes, hours, days, weeks, months, years, moon cycles, and so on.
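For instance, a cyclical quantity such as the hour of day can be encoded as a (cos, sin) pair so that 23:00 and 00:00 end up close together; this is a small illustrative sketch, not code from the project.

```python
import numpy as np

def cyclic_encode(value, period):
    """Map a cyclical quantity (hour, weekday, moon phase, ...) to a (cos, sin) pair."""
    angle = 2.0 * np.pi * np.asarray(value) / period
    return np.cos(angle), np.sin(angle)

hours = np.arange(24)
hour_cos, hour_sin = cyclic_encode(hours, 24)
# Hour 23 and hour 0 are now neighbours in (cos, sin) space, unlike the raw values 23 and 0.
```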

It is also interesting to know where Bitcoin is most used: http://images.google.com/search?tbm=isch&q=bitcoin+heatmap+world With all the above-mentioned examples, it would be possible to have all of this as input features, at every time step: (BTC/USD, BTC/EUR, Dow_Jones, SP_500, hours, days, weeks, months, years, moons, meteo_USA, meteo_EUROPE, Twitter_sentiment).
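Concretely, one way to assemble such a feature vector at every time step is to stack the columns side by side; all of the arrays below are random placeholders, since none of these data sources ship with the project.

```python
import numpy as np

n_steps = 1000  # hypothetical number of time steps

# Placeholder series; in practice each would come from its own data source or API.
btc_usd = np.random.rand(n_steps)
btc_eur = np.random.rand(n_steps)
dow_jones = np.random.rand(n_steps)
sp_500 = np.random.rand(n_steps)
hour = np.arange(n_steps) % 24
hour_cos, hour_sin = np.cos(2 * np.pi * hour / 24), np.sin(2 * np.pi * hour / 24)
twitter_sentiment = np.random.rand(n_steps)

# Shape (n_steps, n_features): one row of features per time step, ready to be windowed for an RNN.
features = np.column_stack(
    [btc_usd, btc_eur, dow_jones, sp_500, hour_cos, hour_sin, twitter_sentiment]
)
print(features.shape)  # (1000, 7)
```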

Data Mining - Forecasting using Neural Networks in RStudio

The main concept of this Data Mining project is to forecast the closing prices of the stock market based on past data sets. Note: Watch with subtitles :)

Predicting Multiple Discrete Values with Multinomials, Neural Networks and { nnet } - ML with R

Using R and the multinom function from the { nnet } package, we can easily predict discrete outcomes / factors with more than 2 levels. With the help of Repeated Cross ...
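For readers working in Python rather than R, the same idea of predicting a factor with more than two levels can be sketched with scikit-learn's multinomial logistic regression; this is only an analogue, not the R / { nnet } code from the video.

```python
# Multinomial (softmax) logistic regression on a 3-class dataset, as a Python analogue
# of R's nnet::multinom; plain 5-fold cross-validation stands in for repeated CV.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = load_iris(return_X_y=True)        # 3 classes, i.e. a factor with 3 levels
clf = LogisticRegression(max_iter=1000)  # handles multiclass targets with a softmax output
scores = cross_val_score(clf, X, y, cv=5)
print(scores.mean())
```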

Data Predictor Using Neural Networks

In this project, I built a program using neural networks in MATLAB for predicting the pollution in a lake near a chemical plant in Saudi Arabia. I received the daily ...

Note RNN

Sample generated from an LSTM note prediction model.

BigQuery and Cloud Machine Learning: advancing neural network predictions (Google Cloud Next '17)

The real value of BigQuery is not its speed. It's the power of "democratizing enterprise data." Because of BigQuery's scalability, you can isolate any workload on ...

Regression forecasting and predicting - Practical Machine Learning Tutorial with Python p.5

In this video, make sure you define the X's like so. I flipped the last two lines by mistake: X = np.array(df.drop(['label'],1)) X = preprocessing.scale(X) X_lately ...
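Reconstructed as a block, the feature definitions described there look roughly like this; the description is cut off after X_lately, so the last two lines are an assumption about how the most recent rows are sliced off for out-of-sample forecasting, and the dataframe and forecast_out value are stand-ins.

```python
import numpy as np
import pandas as pd
from sklearn import preprocessing

# Stand-in for the tutorial's dataframe: a few feature columns plus a 'label' column.
df = pd.DataFrame(np.random.rand(100, 4), columns=["f1", "f2", "f3", "label"])
forecast_out = 10  # hypothetical: how far ahead the label column was shifted

X = np.array(df.drop(columns=["label"]))  # feature matrix without the label column
X = preprocessing.scale(X)                # scale the features, in the order given in the description
X_lately = X[-forecast_out:]              # assumed continuation: newest rows, kept for forecasting
X = X[:-forecast_out]                     # assumed continuation: remaining rows for training/testing
```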

Supervised Machine Learning: Sequence Prediction (Part 4 of 8)

(Part 4 of 8) Jon McLoone explains the supervised machine learning technique of sequence prediction and how it differs from prediction, demonstrating ...

Neural Network Calculation (Part 1): Feedforward Structure

In this series we will see how a neural network actually calculates its values. This first video takes a look at the structure of ..
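As a compact illustration of that feedforward calculation, here is a tiny fully connected network in Python; the sizes and weights are arbitrary, purely to show how values flow forward through the structure.

```python
import numpy as np

def sigmoid(z):
    """Standard logistic activation."""
    return 1.0 / (1.0 + np.exp(-z))

# A tiny 2-3-1 network: 2 inputs, one hidden layer of 3 neurons, 1 output neuron.
x = np.array([0.5, -1.2])                    # input vector
W1, b1 = np.random.randn(3, 2), np.zeros(3)  # hidden-layer weights and biases
W2, b2 = np.random.randn(1, 3), np.zeros(1)  # output-layer weights and bias

hidden = sigmoid(W1 @ x + b1)       # each hidden neuron: weighted sum of inputs, then activation
output = sigmoid(W2 @ hidden + b2)  # output neuron: weighted sum of hidden activations
print(output)
```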

Practical 2.2 – nn package

Neural Networks – An overview of Torch's nn package. Full project: Regression example: ..

Deep Learning with Tensorflow - The Sequential Problem

Enroll in the course for free at: Deep Learning with TensorFlow - Introduction. The majority of data ..