NeuralProphet is a time series forecasting library very similar to Facebook's Prophet. It runs on PyTorch, rather than on PyStan as fbprophet does. Fitting the model with stochastic gradient descent brings some potential advantages.
This brief post explores overfitting in neural networks. It comes from reading the paper:
Towards Understanding Generalization of Deep Learning: Perspective of Loss Landscapes
Here we find some data to examine the state of renewable energy production within Europe.
The data is collected from:
In this post I will walk through the steps from a simple linear regression to a non-linear probabilistic model built with a neural network.
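The starting point of that progression, ordinary least squares, has a closed-form solution. A minimal NumPy sketch (the data here is synthetic, purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: y = 2x + 1 plus Gaussian noise
x = rng.uniform(0, 10, size=100)
y = 2.0 * x + 1.0 + rng.normal(0, 0.5, size=100)

# Design matrix with an intercept column
X = np.column_stack([np.ones_like(x), x])

# Closed-form least squares fit
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta)  # roughly [1.0, 2.0]
```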
Here we look at fitting a normal distribution to some data using TensorFlow Probability.
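The underlying idea, maximising the Gaussian log-likelihood by gradient descent, mirrors what the TensorFlow Probability version optimises. A plain NumPy sketch with synthetic data (not the post's actual code):

```python
import numpy as np

rng = np.random.default_rng(42)
data = rng.normal(loc=3.0, scale=1.5, size=2000)

# Parameters: mean and log of the standard deviation
# (the log keeps sigma positive during optimisation)
mu, log_sigma = 0.0, 0.0
lr = 0.05

for _ in range(500):
    sigma = np.exp(log_sigma)
    z = (data - mu) / sigma
    # Gradients of the mean negative log-likelihood
    grad_mu = -np.mean(z) / sigma
    grad_log_sigma = 1.0 - np.mean(z**2)
    mu -= lr * grad_mu
    log_sigma -= lr * grad_log_sigma

print(mu, np.exp(log_sigma))  # close to 3.0 and 1.5
```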
We compare the performance of adding a batch normalisation layer to a convolutional neural network (CNN). For this we use the results from a previous post on creating a CNN for the fashion MNIST data:
In this post we will build a Convolutional Neural Network (CNN) in order to classify images from the fashion MNIST dataset. CNNs are commonly used with image data because they efficiently exploit spatial relationships between nearby pixels.
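The core operation in such a network, a 2D convolution applied over local patches of the image, can be sketched in plain NumPy (a hypothetical helper for illustration, not code from the post; as in most deep learning libraries this is technically cross-correlation):

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2D convolution (no padding, stride 1)."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            # Each output pixel depends only on a local patch,
            # which is how CNNs exploit nearby-pixel structure
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# A vertical-edge kernel responds where values change left to right
image = np.zeros((5, 5))
image[:, 2:] = 1.0
kernel = np.array([[1.0, -1.0]])
print(conv2d(image, kernel))  # -1.0 in the column at the edge
```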
Here we explore the theoretical coefficient distributions from a linear regression model. When fitting a regression model we can get estimates for the standard deviation of the coefficients. We use bootstrapping to get an empirical distribution of the regression coefficients to compare against those theoretical distributions.
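A minimal NumPy sketch of the bootstrap step, refitting ordinary least squares on each resample (the data here is synthetic):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
x = rng.uniform(0, 5, n)
y = 1.5 * x + 2.0 + rng.normal(0, 1.0, n)
X = np.column_stack([np.ones(n), x])

boot_coefs = []
for _ in range(1000):
    idx = rng.integers(0, n, n)  # resample rows with replacement
    b, *_ = np.linalg.lstsq(X[idx], y[idx], rcond=None)
    boot_coefs.append(b)
boot_coefs = np.array(boot_coefs)

# Empirical mean and standard deviation of each coefficient across
# resamples, to compare against the theoretical standard errors
print(boot_coefs.mean(axis=0), boot_coefs.std(axis=0))
```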
Parquet files are a columnar data format we can use to store dataframes. They can be stored in partitions, which allows us to load only a subset of the data. This is useful if we are filtering the data, as we can do that without loading it all into memory.
This follows on from the previous post on fitting a Gaussian distribution with Pyro:
Fitting a Distribution with Pyro