While learning by doing was definitely one of the reasons I decided to embark on the project, I had practical goals too. For example, I wanted to keep an eye on the radiators located in the attic: not necessarily by switching power on and off, but by getting alarms if I'm heating too much or too little, so that I can tune the power manually. Saving some money, in practice. Also, it is nice to get reminders from the humidor that the cigars are drying out 😉
I personally learned several things while working on it, and hopefully, via this blog post, you can too!
The idea of the project is relatively simple: place a few RuuviTag sensors around the house, collect the data and push it into the AWS cloud for permanent storage and additional processing. From there, several solutions can be built around the data, visualisation and alarms being only a few of them.
The solution is built using AWS serverless technologies, which keeps the running expenses low while requiring almost no maintenance. The following code samples are only snippets from the complete solution, but I've tried to collect the relevant parts.
Tag sensors broadcast their data (humidity, temperature, pressure, etc.) via Bluetooth LE periodically. Because Ruuvi is an open-source-friendly product, there are already several ready-made solutions and libraries to utilise. I went with node-ruuvitag, which is a Node.js module. (Note: I found that the module works best with Linux and Node 8.x, but you may be successful with other combinations, too.)
The Raspberry Pi runs a small Node.js application that both listens for incoming messages from the RuuviTags and forwards them to the AWS IoT service. The app communicates with the AWS cloud using the thingShadow client, found in the AWS IoT Device SDK module. The application authenticates using X.509 certificates generated by you or by AWS IoT Core.
const ruuvi = require('node-ruuvitag')
const colors = require('colors')

// Forward updates at most once per 10 minutes per tag (see below);
// the registration retry interval is an assumption
const TAG_UPDATE_INTERVAL = 10 * 60 * 1000
const TAG_REGISTER_INTERVAL = 60 * 1000

const sensors = [
  { id: 'f16bcba62cbd', name: 'Room', notified: 0, color: 'green', tag: null },
  { id: 'f9cc863b43f1', name: 'Attic', notified: 0, color: 'cyan', tag: null },
  { id: 'c64981244133', name: 'Humidor', notified: 0, color: 'magenta', tag: null },
  { id: 'fb33d5b20a3d', name: 'Outside', notified: 0, color: 'blue', tag: null },
]

async function registerService(shadow, sensors, updateInterval) {
  const tags = await ruuvi.findTags()
  console.log(colors.gray(`Found ${tags.length} tags`))
  tags.forEach(tag => {
    const sensor = sensors.find(sensor => sensor.id === tag.id)
    // Register the handler if the tag is known and not subscribed yet
    if (sensor && !sensor.tag) {
      sensor.tag = tag
      sensor.tag.on('updated', getTagDataHandler(shadow, sensor, sensor.tag, updateInterval))
    }
  })
}

function startService(shadow, sensors, registerInterval, updateInterval) {
  // Run registration immediately and then periodically to ensure all tags get collected
  registerService(shadow, sensors, updateInterval)
  setInterval(() => {
    registerService(shadow, sensors, updateInterval)
  }, registerInterval)
}

// Create and run the service
const shadow = createShadow()
shadow.client.on('connect', () => {
  console.log(colors.bgCyan('Connected to AWS IoT'))
  shadow.client.register(shadow.path, {}, () => {
    startService(shadow, sensors, TAG_REGISTER_INTERVAL, TAG_UPDATE_INTERVAL)
  })
})
app.js
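The createShadow helper isn't included in the snippet. A minimal sketch of what it could look like, assuming X.509 certificate files downloaded from AWS IoT Core (the file paths, client id, thing name and endpoint are placeholders, not values from the actual solution):

const awsIot = require('aws-iot-device-sdk')

function createShadow() {
  // Connect using the X.509 certificates generated via AWS IoT Core
  const client = awsIot.thingShadow({
    keyPath: './certs/private.pem.key',
    certPath: './certs/certificate.pem.crt',
    caPath: './certs/AmazonRootCA1.pem',
    clientId: 'ruuvitag-gateway',
    host: 'xxxxxxxxxxxxxx-ats.iot.eu-west-1.amazonaws.com', // your IoT endpoint
  })
  // path is the thing name the shadow is registered and updated with
  return { client, path: 'ruuvitag-gateway' }
}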
The script runs as a Linux service. While the tags broadcast data every second or so, the app on the Raspberry Pi forwards the data only once every 10 minutes for each tag, which is more than sufficient for the purpose. This is also an easy way to keep the processing and storage costs in AWS very low.
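The getTagDataHandler function referenced earlier isn't shown in the snippet. A minimal sketch of how it could implement this throttling, assuming the reading is published as the thing shadow's reported state (the field selection is an assumption):

function getTagDataHandler(shadow, sensor, tag, updateInterval) {
  let lastUpdate = 0
  return data => {
    const now = Date.now()
    // Tags broadcast roughly every second; skip until the interval has passed
    if (now - lastUpdate < updateInterval) {
      return
    }
    lastUpdate = now
    // Forward the reading to AWS IoT as the shadow's reported state
    shadow.client.update(shadow.path, {
      state: {
        reported: {
          id: sensor.id,
          time: now,
          humidity: data.humidity,
          temperature: data.temperature,
          pressure: data.pressure,
        },
      },
    })
  }
}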
When building an IoT or big data solution, one may initially aim for near real-time data transfers and high data resolution, even though the solution built on top may not really require them. Consider the alternatives: sending data in batches once an hour at a 10-minute resolution may be perfectly sufficient, and it is also cheaper to execute.
When running the broadcast listening script on a Raspberry Pi, there are a couple of things to consider, for example making sure the Node.js process has sufficient privileges to access the Bluetooth stack. With these in place, the setup has been working without issues so far.
Once the data hits AWS IoT Core, there can be several rules for handling the incoming messages. In this case, I set up a Lambda to be triggered for each message. AWS IoT also provides a way to do the DynamoDB inserts directly from the messages, but I found it a more versatile and development-friendly approach to use a Lambda in between instead.
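The rule itself is essentially an SQL statement over an MQTT topic. Since the gateway publishes via a thing shadow, the rule could be along these lines (the thing name is a placeholder, and the exact topic depends on how the gateway publishes):

SELECT * FROM '$aws/things/ruuvitag-gateway/shadow/update'

The Lambda triggered by the rule then stores each message into DynamoDB: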
import 'source-map-support/register'
import * as omit from 'lodash.omit'
import { IoTEvent } from '../interfaces'
import { ScrewedTable, getMapper } from '../tables'

export const handle = async (event: IoTEvent, _context): Promise<void> => {
  console.log('Handling event:', event)
  const mapper = getMapper()
  // Store data into the table; drop id and time from the data field
  const entry = Object.assign(new ScrewedTable(), {
    type: 'tag',
    id: `${event.id}-${event.time}`, // Unique identifier for a single tag and time
    time: event.time,
    data: JSON.stringify(omit(event, ['id', 'time'])),
  })
  console.debug('Writing entry:', entry)
  await mapper.put(entry)
}
iot.ts
DynamoDB works well as permanent storage in this case: the data structure is simple, and the service provides on-demand scalability and billing. Just pay attention when designing the table structure and make sure it fits your use cases, as changes done afterwards may be laborious. For more information about the topic, I recommend watching the talk Advanced Design Patterns for DynamoDB.
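The ScrewedTable class and getMapper helper imported by the handler come from a tables module that isn't shown. A minimal sketch using the @aws/dynamodb-data-mapper library; the table name, key schema and region below are assumptions, not the author's actual design:

import { DataMapper } from '@aws/dynamodb-data-mapper'
import { attribute, hashKey, rangeKey, table } from '@aws/dynamodb-data-mapper-annotations'
import DynamoDB = require('aws-sdk/clients/dynamodb')

@table('iot-measurements') // assumed table name
export class ScrewedTable {
  @hashKey()
  id: string // e.g. '<tagId>-<timestamp>'

  @rangeKey()
  time: number

  @attribute()
  type: string

  @attribute()
  data: string
}

export function getMapper(): DataMapper {
  return new DataMapper({ client: new DynamoDB({ region: 'eu-west-1' }) })
}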
Once we have the data stored in a semi-structured format in the AWS cloud, it can be visualised or processed further. I set up a periodic Lambda to retrieve the data from DynamoDB and generate CSV files into a public S3 bucket, for React clients to pick up. The CSV format was preferred over, for example, JSON to decrease the file size. At some point, I may also try out the Parquet format and see if it suits the purpose even better.
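The export Lambda itself isn't shown in the post. A rough sketch of what it could look like under the assumptions above (the bucket name matches the URL used by the client below; the scan-everything approach and the CSV columns are assumptions, and a real version would query per tag and time range):

import { S3 } from 'aws-sdk'
import { ScrewedTable, getMapper } from '../tables'

const s3 = new S3()

export const handle = async (): Promise<void> => {
  const mapper = getMapper()
  const rows: string[] = ['time,temperature,humidity,pressure'] // CSV header
  // Collect measurements; filtering per tag is omitted for brevity
  for await (const entry of mapper.scan(ScrewedTable)) {
    const data = JSON.parse(entry.data)
    rows.push([entry.time, data.temperature, data.humidity, data.pressure].join(','))
  }
  await s3.putObject({
    Bucket: 'iot.mydomain.com',   // assumed bucket behind the public URL
    Key: 'tag-f16bcba62cbd.csv',  // one file per tag in the complete solution
    Body: rows.join('\n'),
    ContentType: 'text/csv',
  }).promise()
}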
The React application fetches the CSV file from S3 using a custom hook and passes it to a Highcharts component.
During my professional career, I've learnt that data visualisations often cause various challenges due to limitations and/or bugs in the implementations. After using several chart components, I personally prefer Highcharts over other libraries, when possible.
import { useEffect, useState } from 'react'
import axios from 'axios'
import * as parse from 'csv-parse/lib/sync'
import { IoTData } from '../interfaces'

export function useTagData(tagId: string): IoTData {
  const initialData: IoTData = []
  const [data, setData] = useState(initialData)
  useEffect(() => {
    axios.get(`https://iot.mydomain.com/tag-${tagId}.csv`)
      .then(csvData => {
        const output = parse(csvData.data, {
          skip_empty_lines: true,
        })
        setData(output
          .filter((entry: any, index: number) => index > 0) // Skip the header row
          .sort((a: any, b: any) => parseInt(a[0]) - parseInt(b[0])) // Sort by timestamp
          .map((entry: any) => {
            // Map data into format suitable for Highcharts
          })
        )
      })
  }, [tagId]) // Re-fetch only when the tag changes
  return data
}
hooks.ts
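On the component side, the hook's output can then be handed to Highcharts. A small sketch using the highcharts-react-official wrapper; the component shape and the series mapping are assumptions:

import * as React from 'react'
import * as Highcharts from 'highcharts'
import HighchartsReact from 'highcharts-react-official'
import { useTagData } from './hooks'

export function TagChart({ tagId, name }: { tagId: string, name: string }) {
  const data = useTagData(tagId)
  const options: Highcharts.Options = {
    title: { text: name },
    // Assumes the hook maps CSV rows into [timestamp, value] pairs
    series: [{ type: 'line', name: 'Temperature', data }],
  }
  return <HighchartsReact highcharts={Highcharts} options={options} />
}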
Visualisations work well for seeing the current status and how the values vary over time. However, in case something drastic happens, like the humidor humidity dropping below the preferred level, I'd like to get an immediate notification about it. This can be done, for example, using Telegram bots:
import axios from 'axios'

export async function notifyBot(token: string, chatId: string, message: string): Promise<void> {
  await axios.post(`https://api.telegram.org/bot${token}/sendMessage`, {
    chat_id: chatId,
    text: message,
    parse_mode: 'Markdown',
  })
}
telegram.ts
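As a usage example, an alarm check could call the helper when a reading crosses a threshold; the threshold value and the environment variable names below are assumptions:

import { notifyBot } from './telegram'

const HUMIDOR_HUMIDITY_THRESHOLD = 60 // percent; assumed preferred level

export async function checkHumidor(humidity: number): Promise<void> {
  if (humidity < HUMIDOR_HUMIDITY_THRESHOLD) {
    // Token and chat id are read from (assumed) environment variables
    await notifyBot(
      process.env.TELEGRAM_TOKEN!,
      process.env.TELEGRAM_CHAT_ID!,
      `*Humidor alert*: humidity is down to ${humidity}%`,
    )
  }
}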
By now, you should have some kind of understanding of how one can combine IoT sensors, AWS services and outputs like web apps and Telegram nicely together using serverless technologies. If you've built something similar or taken a very different approach, I'd be happy to hear about it!
Price tag
Building and running your own IoT solution using RuuviTags, a Raspberry Pi and the AWS cloud does not require big investments: the hardware is a one-time purchase, and the serverless services keep the monthly AWS expenses low.
And after looking into the numbers, there are several places to optimise as well. For example, some Lambdas are executed more often than really needed.
Next steps
I'm happy to say this hobby project has reached that certain level of readiness where it runs smoothly day after day and is genuinely valuable to me. As a next step, I'm planning to add some kind of time range selection: as the amount of data increases, it will be interesting to see how the values vary in the long term. It would also be a good exercise to integrate some additional AWS services to detect drastic changes, or communication failures between the device and the cloud, when they happen. One way or another, at least now I have a good base to continue from, or to build something totally different next time 🙂
This project is by no means a snowflake and has been inspired by existing projects and work.
At Nordcloud we are always looking for talented people. If you enjoyed reading this post and would like to work with public cloud projects on a daily basis, check out our open positions here.