Notebook showing usage of the data detective package.
To load from a local db, we just need the path to the database file.
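A minimal sketch of constructing the connection url from a local path (the path below is a made-up example, and the DataParser usage assumes the class described later in this notebook):

```python
from pathlib import Path

# Hypothetical path to a local Home Assistant sqlite database.
db_path = Path("/home/homeassistant/.homeassistant/home-assistant_v2.db")

# SQLAlchemy-style url: "sqlite:///" plus an absolute path gives 4 slashes.
db_url = f"sqlite:///{db_path}"

# parser = DataParser(db_url)  # the DataParser class described below
```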
Alternatively, to load from a cloud database, we read the connection url from a JSON file, since the url contains our credentials and we want to keep these secret. To learn how the Google Cloud SQL recorder can be set up, check out https://github.com/robmarkcole/HASS-Google-Cloud-SQL
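A sketch of the secrets-file approach. The filename, file contents, and url are all invented for illustration; in practice you create the file once by hand and keep it out of version control:

```python
import json
from pathlib import Path

# Create a demo secrets file; the url below is a made-up example.
Path("secrets.json").write_text(json.dumps(
    {"url": "mysql://user:password@12.34.56.78/homeassistant"}))

# Load the connection url so the credentials never appear in the notebook.
url = json.loads(Path("secrets.json").read_text())["url"]
# parser = DataParser(url)  # assuming the DataParser class described below
```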
We use the DataParser class to load data from the database. This class performs the SQL queries and parses the returned data. The class holds the master pandas dataframe master_df.
Let's query a single sensor and demonstrate the data processing steps implemented by the library.
Data-detective takes care of parsing data from the database, intelligently sorting out numerical and categorical data and formatting them correctly.
Use fetch_data_by_list to query a list of numerical entities; these must all be from the same domain, and the list must contain a minimum of 2 entities. Returns a pandas dataframe.
Use fetch_all_data to import all your db data into a pandas dataframe in memory. This approach means it can take a while to load the data, but subsequent processing and handling are much faster and easier.
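As an illustration (not detective's actual output), a dataframe in the shape fetch_data_by_list returns might look like the following, with one column per queried entity; the entity_ids and readings are invented:

```python
import pandas as pd

# Illustrative only: one column per queried entity, indexed by time.
df = pd.DataFrame(
    {"sensor.living_room_temperature": [19.5, 19.7, 20.1],
     "sensor.bedroom_temperature": [18.2, 18.3, 18.6]},
    index=pd.to_datetime(["2018-01-01 00:00",
                          "2018-01-01 00:05",
                          "2018-01-01 00:10"]))

# The real calls, assuming a DataParser instance `parser` as above:
# df = parser.fetch_data_by_list(["sensor.living_room_temperature",
#                                 "sensor.bedroom_temperature"])
# parser.fetch_all_data()  # populates parser.master_df
print(df.shape)
```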
The NumericalSensors class parses the numerical data. Let's create a dataframe for the numerical sensor data.
We can access the list of sensor entities using the list_sensors attribute
Now let's look at the dataframe.
Let's now check for correlations in the data using the all_corrs() method.
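The idea behind a correlation check can be sketched with plain pandas (this is not detective's implementation, and the sensor names and data are synthetic): compute the pairwise correlation matrix of the numerical columns.

```python
import numpy as np
import pandas as pd

# Synthetic data: two sensors tracking the same underlying temperature.
rng = np.random.default_rng(0)
temp = rng.normal(20, 2, 100)
df = pd.DataFrame({
    "sensor.living_room_temperature": temp + rng.normal(0, 0.5, 100),
    "sensor.mean_temperature": temp + rng.normal(0, 0.2, 100)})

# Pairwise correlation matrix of all numerical columns.
corrs = df.corr()
# Sensors tracking the same quantity correlate strongly (close to 1).
print(corrs.loc["sensor.mean_temperature",
                "sensor.living_room_temperature"])
```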
Unsurprisingly, the mean temperature is strongly correlated with all of the temperature sensors.
Interestingly, my iPhone battery level is somewhat inversely correlated with the travel time from home to Waterloo, which gets longer late at night, when my battery level is more likely to be low.
We can pass a list of entities to plot:
We can even mix lists and single entities:
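One way a plotting call can accept a mix of lists and single entities is by flattening its arguments first; a hypothetical helper (not detective's code) showing the idea:

```python
def flatten_entities(*args):
    """Flatten a mix of single entity_ids and lists of entity_ids."""
    entities = []
    for arg in args:
        if isinstance(arg, list):
            entities.extend(arg)
        else:
            entities.append(arg)
    return entities

# e.g. plot(["sensor.a", "sensor.b"], "sensor.c") could normalise to:
print(flatten_entities(["sensor.a", "sensor.b"], "sensor.c"))
```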
Currently we can plot a single binary sensor with the plot() method
Let's analyse motion_at_home: create some features for day of week and time category, then analyse motion by these features.
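The feature engineering step can be sketched with plain pandas; the motion event timestamps below are invented, and the time categories are one possible choice of buckets:

```python
import pandas as pd

# Hypothetical motion events for binary_sensor.motion_at_home.
events = pd.DataFrame(
    {"last_changed": pd.to_datetime(
        ["2018-01-01 08:30", "2018-01-01 13:10",
         "2018-01-02 19:45", "2018-01-06 23:20"])})

# Feature: day of week.
events["weekday"] = events["last_changed"].dt.day_name()

# Feature: coarse time-of-day category.
def time_category(ts):
    if 6 <= ts.hour < 12:
        return "morning"
    if 12 <= ts.hour < 18:
        return "afternoon"
    if 18 <= ts.hour < 23:
        return "evening"
    return "night"

events["time_category"] = events["last_changed"].apply(time_category)

# Motion counts grouped by the new features.
print(events.groupby(["weekday", "time_category"]).size())
```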