LightDogs

December 2018

Individual Project

What is LightDogs?

LightDogs was a Sensing and IoT project, split into two halves.
The first half of the project focused on collecting data from a light sensor and the Twitter API.
The second half of the project involved building a web app to visualise the collected data for an end user.

Skills

> Python

> Raspberry Pi

> Flask

> HTML

> CSS

> Sensors

> Git

This project aimed to build my skillset whilst creating a light-hearted web app and testing for correlation between two unrelated data sources.

Part 1: Sensing

Light Data Collection
A photodiode light sensor with an approximately human-eye spectral response was used to record changes in outdoor light level over three weeks. A Python script running on a Raspberry Pi controlled the sensor, which was mounted inside a box to isolate it from indoor light changes. A sampling rate of 1/300 Hz was used, collecting one datapoint every 5 minutes.
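The sampling loop described above can be sketched as follows. The actual sensor driver depends on the photodiode board used, which the write-up does not specify, so `read_light` is a placeholder here; the timestamped-CSV logging and 5-minute interval follow the description.

```python
import csv
import time
from datetime import datetime

SAMPLE_INTERVAL_S = 300  # one reading every 5 minutes (1/300 Hz)


def read_light():
    """Placeholder for the real sensor read; the actual driver depends
    on the photodiode board used, which is not specified here."""
    raise NotImplementedError


def log_sample(writer, reading, now=None):
    """Append one timestamped reading as a CSV row."""
    now = now or datetime.now()
    writer.writerow([now.isoformat(timespec="seconds"), reading])


def run(path="light.csv"):
    """Sample the sensor every 5 minutes, appending to a CSV file."""
    with open(path, "a", newline="") as f:
        writer = csv.writer(f)
        while True:
            log_sample(writer, read_light())
            f.flush()  # keep the file current between samples
            time.sleep(SAMPLE_INTERVAL_S)
```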
Tweet Data Collection
Tweepy, a Python wrapper for the Twitter API, was used to collect tweets containing the word 'Dog' within a 15 km radius of the sensor location. The same sampling rate was used for both the light and Twitter data.
The sentiment analysis library TextBlob was then used to score the sentiment of each tweet.
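A sketch of the collection step, assuming the v1.1-era Tweepy `api.search` interface that was current at the time; the coordinates and the way results are combined are illustrative, not the project's actual code. TextBlob's polarity score runs from -1.0 (negative) to +1.0 (positive).

```python
def build_geocode(lat, lon, radius_km):
    """Twitter's search geocode filter takes a 'lat,lon,Nkm' string."""
    return f"{lat},{lon},{radius_km}km"


def tweet_polarity(text):
    """Score a tweet's sentiment with TextBlob (-1.0 to +1.0)."""
    from textblob import TextBlob
    return TextBlob(text).sentiment.polarity


def collect(api, lat, lon, radius_km=15):
    """Fetch matching tweets through a Tweepy API handle and score each.
    `api.search` follows the v1.1-era Tweepy interface (illustrative)."""
    for tweet in api.search(q="Dog", geocode=build_geocode(lat, lon, radius_km)):
        yield tweet.created_at, tweet.text, tweet_polarity(tweet.text)
```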
The light sensor mounted inside a box, isolating it from indoor light changes.
Communication
I communicated with the Raspberry Pi wirelessly over SSH, using Weaved (now Remote.it) for remote access to the Pi. Screen was used to run each script in its own detached terminal session.
The light and Twitter data were written to a new CSV file every 12 hours. These files were backed up to Box every 24 hours using a cron job and rclone.
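Rotating to a new CSV every 12 hours amounts to mapping each timestamp to its half-day file. A minimal sketch, with a filename scheme of my own choosing rather than the project's actual one:

```python
from datetime import datetime


def csv_name(ts, prefix="light"):
    """Map a timestamp to its 12-hour file window, e.g.
    light_2018-12-01_AM.csv for 00:00-11:59 and _PM for 12:00-23:59."""
    half = "AM" if ts.hour < 12 else "PM"
    return f"{prefix}_{ts:%Y-%m-%d}_{half}.csv"


# The daily backup was a cron job calling rclone; an illustrative
# crontab line (paths and remote name are placeholders):
#   0 3 * * * rclone copy /home/pi/data box:lightdogs
```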
The System

Part 2: Web App

Structure

The Flask framework was used to build the LightDogs web app, allowing Python logic to be combined with HTML markup to create the web pages.
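The basic shape of such a Flask app is shown below; the route and page content are illustrative, not the project's actual ones, and the real app would render HTML templates with `render_template`.

```python
from flask import Flask

app = Flask(__name__)


@app.route("/")
def dashboard():
    # The real app would render a template here, e.g.
    # render_template("dashboard.html", plots=...)
    return "<h1>LightDogs</h1>"

# Start locally with:  flask run
```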

Bootstrap was used to improve the UI and UX of the web app, and was themed with custom CSS.

Data 

Pandas DataFrames were used to store and manipulate the collected data, and Bokeh was used to visualise it in interactive plots.

The data was preprocessed to fill gaps by imputation, then downsampled into one-hour bins, and a sentiment score was calculated for each bin.
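With pandas this preprocessing is compact. A minimal sketch, assuming time-based interpolation as the imputation method (the write-up does not name one) and mean aggregation per hour bin:

```python
import pandas as pd


def preprocess(readings: pd.Series) -> pd.Series:
    """Fill gaps by time-based interpolation, then downsample the
    5-minute readings into one-hour mean bins."""
    filled = readings.interpolate(method="time")
    return filled.resample("1h").mean()
```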

After normalising this data, analysis was carried out: examining trends, seasonality and noise, and computing autocorrelation and the cross-correlation between the two series.
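The normalisation and correlation steps can be sketched with pandas; the z-score normalisation and Pearson correlation here are standard choices, assumed rather than taken from the project's code.

```python
import pandas as pd


def zscore(s: pd.Series) -> pd.Series:
    """Normalise to zero mean and unit variance before comparison."""
    return (s - s.mean()) / s.std()


def lag1_autocorr(s: pd.Series) -> float:
    """Correlation of a series with itself shifted by one bin."""
    return s.autocorr(lag=1)


def cross_corr(light: pd.Series, sentiment: pd.Series) -> float:
    """Pearson correlation between the two normalised hourly series."""
    return zscore(light).corr(zscore(sentiment))
```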

Main Dashboard

Data Visualisation

Summary

Improvements

In the future, this project could be improved through four main measures:

  1. Further work on the UI and UX of LightDogs

  2. JavaScript could be used instead of Flask to give greater functionality

  3. Data could be collected from a wider catchment area to improve the location-based accuracy.

  4. The question of 'correlation is not causation' needs further analysis; light and Twitter data alone are not enough to draw concrete conclusions.

Download the full report:
