JackKelly / neuralnilm_prototype

Licence: MIT license
No description or website provided.

Programming Languages

Jupyter Notebook
11667 projects
Python
139335 projects - #7 most used programming language

Projects that are alternatives to or similar to neuralnilm_prototype

Gemello
No description or website provided.
Stars: ✭ 18 (-57.14%)
Mutual labels:  energy, nilm
HPC
A collection of various resources, examples, and executables for the general NREL HPC user community's benefit. Use the following website for accessing documentation.
Stars: ✭ 64 (+52.38%)
Mutual labels:  energy
Reactor-and-Turbine-control-program
This is my Reactor- and Turbine control program for ComputerCraft and BigReactors
Stars: ✭ 18 (-57.14%)
Mutual labels:  energy
tuyapower
Python module to read status and energy monitoring data from Tuya based WiFi smart devices. This includes state (on/off), current (mA), voltage (V), and power (wattage).
Stars: ✭ 101 (+140.48%)
Mutual labels:  energy
units
A run-time C++ library for working with units of measurement and conversions between them and with string representations of units and measurements
Stars: ✭ 114 (+171.43%)
Mutual labels:  energy
AMO-Tools-Suite
AMO-Tools-Suite is an energy efficiency calculation library in C++ with optional Nan Node add-on bindings for the Department of Energy Advanced Manufacturing Office (DOE AMO) Desktop, also known as MEASUR.
Stars: ✭ 16 (-61.9%)
Mutual labels:  energy
Scaphandre
⚡ Electrical power consumption metrology agent. Let scaph dive and bring back the metrics that will help you make your systems and applications more sustainable!
Stars: ✭ 246 (+485.71%)
Mutual labels:  energy
comparison groups
Repository for discussion of Comparison Group topics
Stars: ✭ 22 (-47.62%)
Mutual labels:  energy
ioBroker.sourceanalytix
Detailed analysis of your Energy, gas and liquid consumptions
Stars: ✭ 61 (+45.24%)
Mutual labels:  energy
SparseNILM
The super-state hidden Markov model disaggregator that uses a sparse Viterbi algorithm for decoding. This project contains the source code that was used for my IEEE Transactions on Smart Grid journal paper.
Stars: ✭ 74 (+76.19%)
Mutual labels:  nilm
SpineOpt.jl
A highly adaptable modelling framework for multi-energy systems
Stars: ✭ 25 (-40.48%)
Mutual labels:  energy
PowerSimulations.jl
Julia for optimization simulation and modeling of PowerSystems. Part of the Scalable Integrated Infrastructure Planning Initiative at the National Renewable Energy Lab.
Stars: ✭ 202 (+380.95%)
Mutual labels:  energy
oeplatform
Repository for the code of the Open Energy Platform (OEP) website. The OEP provides an interface to the Open Energy Family
Stars: ✭ 49 (+16.67%)
Mutual labels:  energy
ontology
Repository for the Open Energy Ontology (OEO)
Stars: ✭ 71 (+69.05%)
Mutual labels:  energy
learnergy
💡 Learnergy is a Python library for energy-based machine learning models.
Stars: ✭ 57 (+35.71%)
Mutual labels:  energy
OpenESS
KETI Data Platform : OpenESS(Energy Storage System)
Stars: ✭ 19 (-54.76%)
Mutual labels:  energy
Energy-Calculator
🌏 Simple Energy-Calculator Script In Python
Stars: ✭ 30 (-28.57%)
Mutual labels:  energy
ioBroker.tado
Tado cloud connector to control Tado devices
Stars: ✭ 25 (-40.48%)
Mutual labels:  energy
energy-data
Data on energy by Our World in Data
Stars: ✭ 139 (+230.95%)
Mutual labels:  energy
flexmeasures
The intelligent & developer-friendly EMS to support real-time energy flexibility apps, rapidly and scalably.
Stars: ✭ 79 (+88.1%)
Mutual labels:  energy

Neural NILM Prototype

Early prototype for the Neural NILM (non-intrusive load monitoring) software. This software will be completely re-written as the Neural NILM project.

This is the software that was used to run the experiments for our Neural NILM paper.

Note that Neural NILM Prototype is completely unsupported and is a bit of a mess!

If you really want to re-implement my Neural NILM ideas, then I recommend that you start from scratch using a modern deep learning framework like TensorFlow. Honestly, it will be easier in the long run!
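
If you do re-implement it, the core framing is simple regardless of framework: slide a fixed-length window over the aggregate mains signal and train a net to predict the target appliance's power within each window. Here is a minimal, framework-free sketch of just that windowing step; the function name and defaults are illustrative and not taken from this repo:

```python
def make_windows(mains, appliance, seq_length, stride=None):
    """Cut aligned mains/appliance readings into fixed-length
    (input, target) training windows, as used in seq-to-seq NILM.

    mains, appliance: equal-length lists of power readings (watts).
    seq_length: window length in samples (e.g. 128 for a kettle).
    stride: hop between window starts; defaults to non-overlapping.
    """
    if stride is None:
        stride = seq_length
    pairs = []
    for start in range(0, len(mains) - seq_length + 1, stride):
        end = start + seq_length
        pairs.append((mains[start:end], appliance[start:end]))
    return pairs

# Toy example: 10 samples, window length 4, stride 4 -> 2 windows.
mains = [300, 310, 2500, 2600, 2550, 320, 305, 300, 310, 305]
kettle = [0, 0, 2200, 2300, 2250, 0, 0, 0, 0, 0]
windows = make_windows(mains, kettle, seq_length=4)
print(len(windows))  # prints 2
```

A real pipeline would also normalise the power values and synthesise extra training windows, but the (mains window, appliance window) pairing is the essential shape of the data.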

Directories:

  • neuralnilm contains re-usable library code
  • scripts contains runnable experiments
  • notebooks contains IPython Notebooks (mostly for testing stuff out)

The script that specifies the experiments I ran for the paper is e567.py.

(It's a pretty horrible bit of code, written in a rush!) In that script you can see the SEQ_LENGTH for each appliance and the N_SEQ_PER_BATCH (the number of training sequences per batch). Basically, the sequence length varied from 128 samples (for the kettle) up to 1536 (for the dish washer). The number of sequences per batch was usually 64, although I had to reduce it to 16 for the RNN on the longer sequences.
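
Those settings can be summarised in a few lines. The numbers below come straight from the text above; the dict layout and the >128 "long sequence" cut-off are illustrative, not the actual format or rule e567.py uses:

```python
# SEQ_LENGTH per appliance (samples per training window),
# spanning the range described in the text.
SEQ_LENGTH = {
    'kettle': 128,
    'dish washer': 1536,
}

def n_seq_per_batch(architecture, seq_length):
    """Illustrative rule of thumb matching the text: batches of 64,
    shrunk to 16 for the RNN on longer sequences (the threshold of
    128 here is an assumption, not a value from the script)."""
    if architecture == 'rnn' and seq_length > 128:
        return 16
    return 64

print(n_seq_per_batch('rnn', SEQ_LENGTH['dish washer']))  # prints 16
print(n_seq_per_batch('ae', SEQ_LENGTH['kettle']))        # prints 64
```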

The nets took a long time to train (I don't remember exactly how long, but it was on the order of one day per net per appliance). You can see exactly how long I trained each net in that e567.py script: look at the def net_dict_<architecture> functions and look for epochs. Despite the name, that value is the number of batches (not epochs!) given to the net during training. It's 300,000 for the rectangles net, 100,000 for the AE and 10,000 for the RNN (because the RNN was a lot slower to train). I chose these numbers because the nets appeared to stop learning after that many training iterations.
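
Since each "epoch" in the script is really one batch, the total number of training sequences each net saw is just iterations × N_SEQ_PER_BATCH. A quick sanity check using the numbers above (pairing the RNN with the reduced batch size of 16; for the RNN on shorter sequences the batch size would have been 64):

```python
# Number of training batches per architecture (the misnamed `epochs`).
ITERATIONS = {'rectangles': 300_000, 'ae': 100_000, 'rnn': 10_000}

# Sequences per batch: usually 64; 16 for the RNN on long sequences.
N_SEQ_PER_BATCH = {'rectangles': 64, 'ae': 64, 'rnn': 16}

for net, iters in ITERATIONS.items():
    total = iters * N_SEQ_PER_BATCH[net]
    print(f'{net}: {iters:,} batches -> {total:,} training sequences')
# rectangles: 300,000 batches -> 19,200,000 training sequences
# ae: 100,000 batches -> 6,400,000 training sequences
# rnn: 10,000 batches -> 160,000 training sequences
```

This makes the architectures' very different training budgets concrete: the RNN saw roughly two orders of magnitude fewer sequences than the rectangles net.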

Note that the project description data, including the texts, logos, images, and/or trademarks, for each open source project belongs to its rightful owner. If you wish to add or remove any projects, please contact us at [email protected].