Chatbot Pull Request Watcher / Notifier

Demo of my hackathon to create a Jenkins Build Watcher for Hubot that watches an active build in Jenkins and provides updates when the build completes.

Chatbot Out of Office

I haven’t shared many new projects recently, but that’s because I’ve been hacking at work – at CA Technologies (formerly Rally Software).  Every 3 months, we take a week to hack on something that we feel is valuable to the company.  The following are a few of my most recent hacks.  Enjoy!

Demo of my hackathon to create an Out of Office for Hubot that lets people know when you’re out of the office or busy:

Source code available at

Hubot script available at

TinkurWash – A Talking Dishwasher

I love technology because it enables new ways of solving old problems.

TinkurWash HipChat

Working in an office with only a single kitchen and 50+ busy software engineers, testers, and designers, my coworkers use quite a few dishes throughout each day. The kitchen is the hub of the office – a communal space for eating, talking, and taking a break. However, it’s also easy for dishes to pile up, especially since there’s only one dishwasher.  To maximize the capacity of the dishwasher, it’s important to make sure it gets loaded, washed, and emptied as soon as possible so it’s ready for the next set of dishes. While most of my coworkers help with the dishwasher from time to time, its use isn’t always optimized, with dishes stacking up on the counter and in the sink.  Someone may not realize the dishwasher is full and needs to be started.  Someone else may think the dishwasher is still washing when really it’s dry and needs to be unloaded, especially since the office dishwasher’s status light is on the top of the door and the dishwasher is fairly quiet.


I wanted to give the dishwasher a voice – a social experiment. If the dishwasher could talk to people – telling them when it needs to be run or unloaded – would it help optimize the flow of dishes? Most people want to be helpful, but sometimes they need better instrumentation and information.

TinkurWash v1.0.0

I started prototyping TinkurWash earlier in the year.  The initial concept was to instrument the dishwasher using vibration, temperature, tilt, etc., to deduce when the dishwasher is in its various cycles. I wanted to instrument the dishwasher in a non-invasive manner, allowing the solution to be added to an existing dishwasher without wiring directly into it. I chose an Arduino as the base platform, paired with the Adafruit CC3000 WiFi shield, since the project needed to live near the dishwasher, potentially far from wired network connections.


The initial prototype used a temperature sensor for heat detection, a piezo sensor for vibration detection, a tilt sensor for detecting whether the door is open, and an RGB LED to provide visual feedback about the dishwasher’s status. This was my first time working with a piezo sensor and tilt sensor. When working with new sensors, I start by writing a small unit of code that demonstrates the capabilities of the sensor. I save this sketch in a unit test directory of my project. This not only helps me learn about the sensor, but also provides documentation for future reference and a test in case there are issues down the road. These unit examples are especially useful when debugging a sensor after it’s integrated into a larger code base with multiple sensors and other hardware components.


After integrating the unit examples together, the next step was to learn how to use this instrumentation to determine the state of the dishwasher.  I temporarily mounted an Arduino with sensors to the dishwasher, ran it, and recorded a log of sensor readings over time to understand the values that occurred during washing, drying, unloading, etc. Vibration steadily increased as the dishwasher began filling with water, washed, and rinsed the dishes. Temperature rose more slowly, peaking partway through the washing and rinsing cycles and remaining elevated through the drying cycle. So how to use this information?


I outlined the questions I needed to answer.


Is the dishwasher running?
When the dishwasher is not running, there is very little vibration, and what vibration does occur is inconsistent.  These random vibrations might come from opening the dishwasher to add dirty dishes, opening a nearby cabinet, or even walking nearby. When the dishwasher is washing and rinsing, however, it generates significant and fairly sustained vibration over a long period of time. Because vibration levels can vary between different dishwashers, I implemented a function to baseline the ambient vibration when TinkurWash boots up. I also used a median library and a function that records the 10 most recent vibration readings, using the median value to eliminate outlier data. When the median vibration is greater than 3x the baseline vibration, the dishwasher is assumed to be in a running status.  The LED also changes color when the dishwasher is in a washing status.
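The running-status detection above can be sketched in plain C++. The 10-sample window, the median filter, and the 3x-baseline rule follow the description; the function names and fixed-size array wiring are my own, not TinkurWash’s actual code:

```cpp
#include <algorithm>
#include <cstddef>

// Keep the 10 most recent vibration readings and report their median,
// which throws away outlier spikes (a bumped cabinet, footsteps nearby).
const std::size_t WINDOW = 10;

int medianOfWindow(const int *samples, std::size_t count) {
    int sorted[WINDOW];
    std::copy(samples, samples + count, sorted);
    std::sort(sorted, sorted + count);
    return sorted[count / 2];
}

// Washing is assumed when the sustained (median) vibration exceeds
// 3x the ambient baseline measured when the device boots.
bool isWashing(int medianVibration, int baselineVibration) {
    return medianVibration > 3 * baselineVibration;
}
```

A single spike in the window (one reading of 100 among readings of 5) leaves the median untouched, which is exactly why the median beats a plain average here.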


Is the dishwasher drying?
When the dishwasher is drying, vibrations return to near-baseline levels, but temperature remains high. While a temperature sensor did provide the ability to measure temperature and deduce the drying cycle length, it also required mounting the sensor fairly close to the dishwasher. After some consideration of the best mounting options, I decided to remove the sensor. From a user’s perspective, they could remove the dishes from the dishwasher after the washing cycle and dry the dishes by hand – a realistic scenario. Therefore, I added another state to differentiate between washing and drying, changing the LED color as well so a user can take action if desired.  Since I removed the temperature sensor, I approximated the drying status using a 60 minute timer matched to the length of the drying cycle. While I may add a temperature sensor back in the future, the timer solution seemed adequate for an initial minimum viable product.

TinkurWash Beta Testing at Work

Is the dishwasher done and ready to be unloaded?
When the 60 minute timer completes, the dishwasher is assumed to be in a clean status and ready to be unloaded. Using a tilt sensor, I could determine when the dishwasher door was open. Based on timing my own usage, it seemed to take at least a few minutes to fully unload the dishwasher. Therefore, similar to the timer-based drying state, if the dishwasher door is open for more than 60 seconds, the dishwasher is considered unloaded.


Is the dishwasher ready to be loaded with dirty dishes?
This one is a bit trickier. How does one measure when a dishwasher needs to be run?  As a user, you probably know that you like to run the dishwasher every night after dinner, or you’ve just packed it full of dishes and know it needs washing.  For now, TinkurWash provides visual feedback via its LED that it’s ready for dirty dishes and washing; however, it’s the user’s responsibility to run it. In the future, I anticipate using data collected from the sensors to determine the user’s patterns and proactively ask whether the dishwasher should be run based on those patterns.
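Taken together, the four questions above amount to a small state machine. Here is a minimal model in plain C++, using the rules described (3x baseline for washing, a 60 minute dry timer, a 60 second door-open unload rule); the state names and structure are my own illustration, not TinkurWash’s actual code:

```cpp
// Four states mirroring the four questions: ready for dirty dishes,
// washing, drying, and clean (waiting to be unloaded).
enum class WashState { Ready, Washing, Drying, Clean };

const unsigned long DRY_MS    = 60UL * 60UL * 1000UL; // 60 minute drying timer
const unsigned long UNLOAD_MS = 60UL * 1000UL;        // door open 60 s => unloaded

struct Dishwasher {
    WashState state = WashState::Ready;
    int baseline;                 // ambient vibration sampled at boot
    unsigned long stateSince = 0; // when the current state began (ms)

    explicit Dishwasher(int baselineVibration) : baseline(baselineVibration) {}

    void update(int medianVibration, bool doorOpen, unsigned long nowMs) {
        switch (state) {
        case WashState::Ready:
            // sustained vibration above 3x baseline => washing has started
            if (medianVibration > 3 * baseline) enter(WashState::Washing, nowMs);
            break;
        case WashState::Washing:
            // vibration back near baseline => washing done, start the dry timer
            if (medianVibration <= 3 * baseline) enter(WashState::Drying, nowMs);
            break;
        case WashState::Drying:
            if (nowMs - stateSince >= DRY_MS) enter(WashState::Clean, nowMs);
            break;
        case WashState::Clean:
            // door held open for a minute => assume the dishes were unloaded
            if (doorOpen && nowMs - stateSince >= UNLOAD_MS) enter(WashState::Ready, nowMs);
            else if (!doorOpen) stateSince = nowMs; // restart the door-open timer
            break;
        }
    }

    void enter(WashState s, unsigned long nowMs) { state = s; stateSince = nowMs; }
};
```

Calling update() on every loop pass is what makes the door-open timer work: while the door stays closed, the clock keeps resetting, so only a sustained open counts.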



There are a variety of ways TinkurWash can communicate with users.
TinkurWash has a large diffused RGB LED so users can easily see its status at a glance.

TinkurWash Status Colors

In addition to providing glanceable visual feedback about the status of the dishwasher, TinkurWash is also integrated with our company’s chat tool – HipChat. Each of our teams has its own HipChat room, in addition to a whole-company chat room used for questions, news, or just chatting. TinkurWash posts messages to the company-wide chat room when it starts washing or finishes drying, providing real-time information that will hopefully help my coworkers take care of the dishes.


Every 30 seconds, TinkurWash connects to Xively, a service for storing and retrieving IoT data.  TinkurWash stores the median vibration value, the tilt value, the dishwasher door status, the dishwasher status, and an uptime value since the last reboot.  These values are useful for debugging, feed another integration with Zapier to post messages to HipChat, and preserve the data for future algorithms that might recommend when to run the dishwasher, along with other potential insights.
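For reference, Xively’s v2 API accepts a CSV body with one “channel,value” pair per line, so an update covering the five stored values might look something like the following (the channel names and values here are illustrative, not the actual feed’s):

```text
vibration_median,42
tilt,3
door_open,0
status,2
uptime,86400
```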

Learning and Pivoting

TinkurWash has had many pivots along the way – as most projects do.  While I plan to discuss some of these in more detailed standalone posts to share lessons learned, the following major pivots occurred:


After having issues mounting and monitoring vibration using a piezo sensor, I switched to a 3-axis accelerometer.  An accelerometer can measure movement over time, which added an even greater level of granularity to the vibration instrumentation.  The accelerometer also replaced the tilt sensor, providing the ability to determine if the dishwasher door is tilted, again with much greater granularity.  While the accelerometer ($15) costs about 7x the piezo ($1) and tilt ($1) sensors combined, it’s much more reliable and still a very acceptable cost for a hobby project.  And while the outcome was much better, refactoring hardware + software is somewhat more involved than refactoring software alone.


I finally wanted to have a proper housing for this project. In the past, I’ve used boxes or storage containers as a poor man’s housing for my projects.  I had cut acrylic with my CNC machine before, but hadn’t made a usable housing for a project.  After seeing the Adafruit Raspberry Pi Thermal Printer housing while assembling a printer kit, I realized how awesome an acrylic housing can be and saw an example of how to fabricate one.  After a few pivots of the design using cardboard in place of acrylic, I fabricated a slot-fit acrylic housing, learning lots of lessons – and breaking a few bits – along the way.  I hope to write another post on this topic in the future.


After getting TinkurWash’s sensors and code fairly reliable, I noticed it would stop posting data after anywhere from a few hours to a few days.  Reading around the internet, I found reports of similar stability issues with the CC3000 WiFi chip.  After trying a few different solutions, including a firmware upgrade, I ultimately implemented a watchdog timer that resets TinkurWash if a successful network connection isn’t made every 60 seconds.  Before resetting, TinkurWash stores its current state in non-volatile EEPROM memory, reloading the state variables from EEPROM after rebooting.
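The save-and-restore idea can be sketched as follows, with a byte array standing in for the Arduino’s EEPROM. On real hardware this would go through EEPROM.h and the AVR watchdog (avr/wdt.h), and the struct fields here are illustrative, not TinkurWash’s actual layout:

```cpp
#include <cstring>
#include <cstdint>

// Persist the state variables to non-volatile storage before a watchdog
// reset, and reload them after rebooting. A magic byte marks whether the
// storage actually holds a saved state, so a fresh chip isn't misread.
struct PersistedState {
    uint8_t  magic;      // marker for "valid saved state"
    uint8_t  washState;  // current dishwasher state
    uint32_t stateSince; // when that state began
};

const uint8_t STATE_MAGIC = 0xA5;

void saveState(uint8_t *eeprom, const PersistedState &s) {
    std::memcpy(eeprom, &s, sizeof(s));
}

bool loadState(const uint8_t *eeprom, PersistedState &out) {
    std::memcpy(&out, eeprom, sizeof(out));
    return out.magic == STATE_MAGIC; // false => nothing valid saved yet
}
```

The round trip is cheap enough to do right before every reset, and checking the magic byte on boot distinguishes a watchdog reboot (restore state) from a cold start (begin fresh).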

Source Code


What’s Next

While I certainly have some ideas for new features to enhance the smartness of TinkurWash, including more advanced algorithms using historical data, for now my primary focus is beta testing in the office to monitor stability, learn about any unanticipated real world scenarios, and collect user feedback.

The Future of Hardware, Software, and The Physical World – Takeaways from O’Reilly Solid 2014

Last week, I attended the O’Reilly Solid Conference, focused on solid – the intersection of hardware + software + physical things. While hardware and software have existed for decades and are mature in their own rights, it is often the collision of disparate things that creates innovation. And so over 1,400 curious people gathered together to collide.

I wasn’t sure what to expect of the conference.  I am passionate about hardware + software and the new possibilities for interactions and solutions that it enables.  And while the conference wasn’t exactly what I thought it would be, it surprised, delighted, and taught me, leaving me hopeful and inspired about the future of solid.

While I’ll share some focused takeaways and observations in the remainder of this post, perhaps the thing that made the biggest impression on me was the diversity of the attendees and speakers. From professors to mechanical engineers to startup founders to artists to hackers to industrial designers to the just interested, solid is well poised to collide ideas to enable new and unexpected outcomes. It will be interesting to see the collisions it creates in the future, as well as who composes the community in the coming years.

After attending a diverse set of sessions and chatting with many attendees, the following takeaways stand out for me:

  1. Don’t make a better solution for an unnecessary process. Focus on the user’s desired outcome and make a solution for that. Hardware + software enables new solutions that were hard or impossible previously.  Think about how things should work, not how they do work. This may mean remaking or even eliminating a process. Disruptive change.
  2. Hardware should know us. And it shouldn’t require a separate identity for each device. It should know our preferences and learn about us, no matter where we are. We need single sign on for the physical world. We should be able to interact with the whole world, not just a lightbulb in our living room.  The entire would should be a continuous interaction.
  3. Making is still too hard, but lots of people and companies are working hard to make it easier.  There is a huge complexity difference between 3D printing and CNC. Additional complexity is free in 3D printing while it’s not in CNC due to extra fixturing, rotation, etc. Also, new making machines and materials are enabling new solutions, like combo CAD / CAM / CNC / laser / 3D printing all in one.
  4. Most remaining problems in the world are physical and require a physical solution – hardware + software and the like – to solve them. Software can partially solve them, but can’t bring a full solution to bear alone.
  5. We have tons of gadgets that don’t interact with each other. They need to be able to interact. Users want outcomes not brand specific ecosystems and proprietary closed solutions.
  6. Prototypes spark conversations. Build hardware prototypes fast, fail fast, fail often = more time to iterate and make it great. Sometimes the prototype is just a conversation starter – a way to get other people sharing their perspectives, even if they drastically change your initial direction.  Also, firmware can be used to dramatically iterate a physical product.
  7. You are the button. Hardware enables new methods of interaction. Maybe it’s where you are. Maybe it’s where you aren’t. Maybe it’s where you’re pointing. Maybe it’s who you’re with. Maybe it’s tapping one device to another. Maybe it’s flicking content from one device to another. Reimagine. Be playful. Some things will stick, the rest will be fun.
  8. Hardware can be soft. Just because something is physical doesn’t mean it must be rigid and frozen. Fabrics and flexible materials enable soft hardware, much like humans and animals with skin, muscle, and bone.  And often these are simpler to implement.
  9. Some promising tools and services:
    • Temboo: A service for orchestrating the Internet of Things, offering integrations with 100s of APIs and 1000s of API Calls.  Helps to limit the resources needed on limited resource embedded devices.
    • Mellow: A home sous vide machine for cooking food, designed more like the Nest in terms of thinking about the real use cases you care about.  Examples: It keeps food cold and starts cooking later in the day so it’s ready when you arrive home.  It asks you how you liked your meal and learns about you to better suit your needs in the future.  I love things that speak, and Mellow is a great example.
    • Node Red: A visual editor for Node.js to orchestrate interactions between the Internet of Things.
    • MQTT: A broker for publishing and subscribing to data using a lightweight protocol, limiting the resources needed on limited resource embedded devices.
    • A service for rapid hardware prototyping through easy data storage.
  10. We’ve come so far, but we are only at the beginning… Some of these things may take years or decades to mature, but the attendees of Solid are leading the way.


Better Ski Maps – Tinkurlab: SlopeStyle

Tinkurlab is going spatial!

We’re adding maps and spatial data to our visualization techniques.
Keep reading for the first foray into mapping…ski slopes!


You’ve heard it all before.

“This run would totally be a green out west.” “This resort isn’t as hard as my home mountain.” “A blue here doesn’t equal a blue there.”

Really? Are you sure? The resorts themselves state that trail ratings are subjective to that particular mountain. Green – Blue – Black – Double Black…and what are those RED trails about in Europe? Even snow conditions can change the rating of one particular trail.


Tinkurlab’s collection of trail maps from the 2013/2014 season…each one different

Ski trail maps are works of art, designed by a couple of people to make you want to visit the resort and display the number of trails – but how much can a color tell you? If it’s the right color, it turns out a whole lot.  Enter Tinkurlab:SlopeStyle.


Earlier this winter, Adam and I relocated from Washington DC to Denver and suddenly had many more ski resorts available to us within a 2 hour drive. Resources like FreshyMap helped us determine at a quick glance which one to hit up.  With so many choices, I was curious about finding a way to objectively compare resorts and tell which parts of each resort I wanted to ski.  In ski progression, it’s easy both to stay on terrain you’ve outgrown and to find yourself in way over your head.  I wanted a way to know what I was getting into before I took that first turn.

Tinkurlab:Slopestyle lets you visually explore the pitch and slope angle of your favorite ski runs.  With this data, you can easily compare these runs with others at the same resort to find new terrain.  Additionally, since the color-coding is the same across the maps, you can now compare resorts with each other.  Pink is pink, no matter where you are!  The site performs the same on a desktop as on a phone, allowing you to plan your runs from any type of chair you’re sitting on.  Happy skiing!  If interested, read on for the technical details on how the site was created.

Built by processing USGS elevation data with GIS commands to calculate and thematically render slope pitch in degrees.

What I needed to start was elevation data.  I found what I was looking for thanks to the USGS, which publishes elevation layers in varying resolutions, available for download using The National Map.  I selected the files for the sections of Colorado and California that I required and set about processing the data.

I used the GDAL utility as my primary GIS tool.  It’s open source and allows the user to execute a variety of commands for translating and processing raster data.  Using the command-line operators, I was able to create greyscale hillshade layers and calculate the slope for the terrain I was interested in.  I then selected break points for the slope degrees and generated a color-relief map based on the pitch.

Example commands:
gdaldem hillshade -s 111120 DEM.adf hillshade.tif
gdaldem slope -s 111120 DEM.adf slopedegree.tif
gdaldem color-relief slopedegree.tif slope-ramp.txt slopeshade.tif -alpha -co ALPHA=YES
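The color-relief step reads its break points from a plain-text ramp file with one “value R G B [A]” line per break (the -alpha flag makes the alpha column meaningful). A slope-ramp.txt along these lines would work, though the break points and colors shown here are illustrative, not the ones actually used for SlopeStyle:

```text
0   255 255 255 255
10  74  199 108 255
20  50  108 191 255
30  30  30  30  255
45  199 46  133 255
```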
Vail Hillshade TIFF

Vail Slopeshade TIFF

Ski resort data overlaid using OpenStreetMap exports stored in PostGIS databases.

The next step was to get the trail map data.  My initial plan was to find a web service containing the features and add it to my map.  That search did not yield great results, so I switched to plan B and learned how to use OpenStreetMap exports.

I used JOSM to extract the aerialway, downhill and woods features for the ski resorts included in my beta site.   Once I had the .osm data, I used osm2pgsql commands to add the features to a PostGIS extended PostgreSQL database.   I found the LearnOSM tutorial extremely helpful (I’m using a Mac with OSX Mavericks, and was able to follow along with the Windows prompts.)

Visualized using Mapbox TileMill and Mapbox Javascript API.

Once I had the data downloaded and processed, it was time to put it all together and create some maps.  I chose Mapbox TileMill to combine my layers and create map tiles.  The CartoCSS-like language enabled styling control, and the built-in raster and PostGIS support made adding data incredibly easy.


Slopestyle TileMill Interface

The tile packages were uploaded to the Tinkurlab Mapbox account, and I set about creating the interactive web map using their JavaScript API.   The examples are straightforward, and with a couple lines of code, I was up and running.

SlopeStyle – Mapbox JavaScript API

Packaged in a responsive website.

It was important to house the webmap in code that would render correctly on any device.  You might investigate the resorts on a desktop at home, but on the mountain, you’re going to be on a phone.  The Twitter Bootstrap framework is perfectly suited for this application.  Both Tinkurlab:Maps and SlopeStyle are built using Bootstrap with some custom CSS files.

The application defaults to the I-70 corridor resorts in Colorado, but the user has the option to switch to the Lake Tahoe area and back.   I chose to focus on the resorts that we skied this season, but plan on expanding in the future.  Other future features might include the ability to select a run to view its entire slope profile, as well as rendering the slope colors based on your personal ski preferences.  Keep checking back for updates!

– Val

TinkurCrate – Where’s The Dog?!


One of my favorite things about the Internet of Things is “giving a voice to things”.  And who’s more deserving of a voice – and a Twitter account – than our roommate’s dog, Tyr?  Tyr, an English Springer Spaniel, enjoys running, playing with his toys, and hanging out in his crate.  In fact, he’s got a fancy crate located in a prime location in our living room.  TinkurCrate is the first in a series of projects to learn more about Tyr’s activities and connect him to the IoT.  TinkurCrate uses an Arduino with Ethernet and a proximity sensor to determine when Tyr is in his crate.  The Arduino posts the data to Xively for storage, and Xively sends Web Hook triggers to Zapier when Tyr enters and exits his crate – triggering Tweets on TinkurLab’s IO Twitter account – @TinkurLabIO.





Wiring for the project is pretty simple: just connect the proximity sensor to the Arduino Ethernet shield as follows:

  • Red to 5v
  • Black to Ground
  • Yellow to Analog Pin 5

TinkurCrate Wiring



/*

Published:  2014
Author:     Adam - TINKURLAB
Web:

Copyright:  This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.

Contributors:
- Xively code based on
- Median library

*/

#include <SPI.h>
#include <Ethernet.h>
#include "RunningMedian.h"

#define APIKEY         "xxxxxxxxxxx" // replace your Xively API Key here
#define FEEDID         xxxxxxxxxxx // replace your Xively Feed ID here
#define USERAGENT      "TinkurCrate" // user agent is the project name

// assign a MAC address for the ethernet controller.
// Newer Ethernet shields have a MAC address printed on a sticker on the shield
// fill in your address here:
byte mac[] = { 0x00, 0x00, 0x00, 0x00, 0x00, 0x00 }; // replace your MAC Address here

// fill in an available IP address on your network here,
// for manual configuration:
//IPAddress ip(10,0,1,20);
// initialize the library instance:
EthernetClient client;

char server[] = "";   // name address for xively API

unsigned long lastConnectionTime = 0;          // last time you connected to the server, in milliseconds
boolean lastConnected = false;                 // state of the connection last time through the main loop
const unsigned long postingInterval = 10*1000; // delay between updates

int modeSwitch = 1;

int incrate = 0;

RunningMedian proximityLast10 = RunningMedian(10);

int timesincrate = 0;

int sensorReading = 0;

int sensorReadingMedian = 0;

void setup() {
  // Open serial communications and wait for port to open:
  Serial.begin(9600);

  delay(2000);

  // Connect to network and obtain an IP address using DHCP
  if (Ethernet.begin(mac) == 0)
  {
    Serial.println("DHCP Failed, reset Arduino to try again");
    Serial.println();
  }
  else
  {
    Serial.println("Arduino connected to network using DHCP");
    Serial.println();
  }
}

void loop() {

  if (modeSwitch == 1)
  {
    // read the analog sensor:
    sensorReading = analogRead(A5);

    proximityLast10.add(sensorReading);

    sensorReadingMedian = proximityLast10.getMedian();

    Serial.println();

    Serial.print("Proximity: ");
    Serial.println(sensorReading);

    Serial.print("Median Proximity: ");
    Serial.print(sensorReadingMedian);
    Serial.print(" w/ ");
    Serial.print(proximityLast10.getCount());
    Serial.println(" samples");

    delay(1000);

    // median above the upper threshold: the dog is in the crate
    if (sensorReadingMedian > 160)
    {
      incrate = 1;
    }

    // median below the lower threshold: no dog detected
    if (sensorReadingMedian < 100)
    {
      incrate = 0;
    }

    Serial.print("Is In Crate: ");
    Serial.println(incrate);

    Serial.println();

  }

  // convert the data to a String
  String dataString = "proximity,";
  dataString += String(sensorReadingMedian);

  // you can append multiple readings to this String to
  // send the xively feed multiple values
  dataString += "\nincrate,";
  dataString += String(incrate);

  // if there's incoming data from the net connection,
  // send it out the serial port.  This is for debugging
  // purposes only:
  if (client.available()) {
    char c =;
    Serial.print(c);
  }

  // if there's no net connection, but there was one last time
  // through the loop, then stop the client:
  if (!client.connected() && lastConnected) {
    Serial.println();
    Serial.println("disconnecting.");

    resetMode();

    client.stop();

  }

  // if you're not connected, and ten seconds have passed since
  // your last connection, then connect again and send data:
  if (!client.connected() && (millis() - lastConnectionTime > postingInterval)) {

    modeSwitch = 2;

    sendData(dataString);
  }
  // store the state of the connection for next time through
  // the loop:
  lastConnected = client.connected();

}

// this method makes a HTTP connection to the server:
void sendData(String thisData) {

  // if there's a successful connection:
  if (client.connect(server, 80)) {
    Serial.println("connecting...");
    // send the HTTP PUT request:
    client.print("PUT /v2/feeds/");
    client.print(FEEDID);
    client.println(".csv HTTP/1.0");
    client.println("Host:");
    client.print("X-ApiKey: ");
    client.println(APIKEY);
    client.print("Content-Length: ");
    client.println(thisData.length());

    // last pieces of the HTTP PUT request:
    client.println("Content-Type: text/csv");
    client.println("Connection: close");
    client.println();

    // here's the actual content of the PUT request:
    client.println(thisData);
    Serial.println(thisData);

    client.println();

  }
  else {
    // if you couldn't make a connection:
    Serial.println("connection failed");
    Serial.println();
    Serial.println("disconnecting.");
    client.stop();
  }
  // note the time that the connection was made or attempted:
  lastConnectionTime = millis();
}

void resetMode()
{
  modeSwitch = 1;
  incrate = 0;
}

Source code also available on GitHub at


Xively Setup

Xively is a service for powering the Internet of Things, providing API access for data storage, data retrieval, triggers, and data charting.  After registering for a free Xively account, perform the following steps:

1) Add Device.

2) Add Channels for “incrate” and “proximity”.


3) Setup Triggers for “incrate” status changes.  One Trigger calls the Zapier “Entering Crate” Zap, and the other Trigger calls the Zapier “Exiting Crate” Zap.



Zapier Setup

Zapier is a service for orchestrating and automating the Internet of Things and popular online services.  After registering for a free Zapier account, perform the following steps:

1) Create a new Zap, using a Web Hook – Catch Hook event, triggering a Twitter – Crate Tweet event.


2) Authorize your Twitter account if needed, and add a Filter to only trigger when the Xively trigger is “1” (when the dog enters the crate).


3) Setup a Message to post to Twitter and save the Zap.


You can view Tyr’s live status at and his Tweets at @TinkurLabIO.

What’s Next

While TinkurCrate is a good beginning, I hope to add a few more sensors to Tyr’s world.  Possibilities include:

  • Pressure sensors to determine when Tyr is on the sofa
  • Continuity sensor to determine when Tyr’s water bowl is empty
  • Processing visualization of Tyr’s in crate / out of crate activity by time of day and day of week

Have any other ideas or suggestions?  Tweet @TinkurLab.



TinkurBooth Photobooth

I remember the first time I visited my friend Jim at college many years ago.  Upon walking into his small two-person dorm room, the first thing I noticed was all the Polaroid photos hanging around his door frame – of smiling friends and a few funny faces.  Jim had a tradition of taking a Polaroid picture when someone new visited him – a way to remember all the great people that one encounters at college.  I’ve always had a love of candid photography – photos that capture the essence of a moment or a person.  Photobooths and Polaroids are great tools for capturing candid pictures – they’re easy, quick, and produce a tangible result.

TinkurBooth Prototype

A basic prototype of TinkurBooth.

While I still love the Four Frames Photobooth, it takes a bit of time to transport, set up, and take down.  Although Four Frames Photobooth has had a busy life – attending multiple weddings, a few business parties, and even a happy hour – I wanted something that would be, well, easier, quicker, and a ‘lil bit more playful.  Meet TinkurBooth – a platform for taking quick and candid photos with endless possibilities for innovation.

TinkurBooth was my first project using a Raspberry Pi.  Many of TinkurLab’s creations have already used microcontrollers such as the Arduino.  However, I’ve been looking for an excuse to try a Raspberry Pi.  Not only are Pis cheap ($30-40), the Pi Camera module is also cheap ($30), and you can use a cheap ($10) wi-fi adapter for connectivity – less than $100 for a photobooth is a pretty good deal.  And honestly, there’s just something geekily awesome about taking pictures using shell commands and switches!

“And honestly, there’s just something geekily awesome about taking pictures using shell commands and switches!”

I wanted TinkurBooth to be more than a “once and done” photobooth.  I wanted it to be a platform for experimentation and play – for trying new ideas.  Coming from my day job as an agile leader and coach, I am often reminded about the countless opportunities for experimentation – to try new things, test a hypothesis, and validate the outcome.  The TinkurBooth experience is built around four steps, which allow for many possibilities: Trigger + Interaction + Capture + Sharing.


Trigger + Interaction + Capture + Sharing

“Aren’t you going a bit too far?  It’s just a photobooth!” is what you’re probably thinking right now.  Let me explain.  Thinking of each of these steps as separate but related helps me think about the possibilities of changing one or more of the steps to create a different experience that engages different people in different ways.  For example, one day the TinkurBooth could use a motion trigger while another day it could use a sound trigger that listens for a clap.  As another example, TinkurBooth could capture a single black and white photo one day while another day it could take four photos and merge them into an animated GIF.  In fact, my goal for the next year is to have a new version of TinkurBooth every month or two to see how each variation changes the user experience.



Trigger

This step is about the system knowing the user wants to interact with it (or convincing the user to interact with it).  Examples of triggers could include:

  • Button: Pressing a button

  • Motion: Passing within a certain area (width and depth)

  • Sound: A loud sound or a certain type of sound (ex. a clap or whistle)

  • Light: A dramatic change in light, indicating something has changed

  • Distance and Movement: Coming within a certain distance

  • Time: A certain amount of elapsed time

  • Proximity: Being within a certain distance, although not necessarily within visual proximity (ex. think about using geo location)

  • Something You Have: Needing a smartphone to interact with it

  • Pattern: Press a button in a certain pattern

And let’s not forget about the possibility of mashups such as Distance + Time.  For example, an early prototype of TinkurBooth required the user to be standing within a 2 foot area before it would take a picture.  Too close and it would yell at you to stand back.  Too far away and it would start to worry and ask you to come back.  Really far away and it would ask you to come play.  And if you stood in just the right spot for two seconds, it would take your picture.
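The Distance + Time mashup above can be sketched as a tiny state machine: pick a message based on distance, and fire the camera only after the subject holds the sweet spot for a couple of seconds. The thresholds and messages here are illustrative, not the prototype's actual values:

```python
# Sketch of a Distance + Time trigger. Thresholds and messages are
# illustrative -- the real prototype's values aren't documented here.
import time

SWEET_SPOT = (50, 110)   # cm range that counts as "the right spot"
HOLD_SECONDS = 2.0

def prompt_for(distance_cm):
    """Pick the LCD message for a given distance reading."""
    if distance_cm < SWEET_SPOT[0]:
        return "Whoa - stand back!"
    if distance_cm <= SWEET_SPOT[1]:
        return "Hold still..."
    if distance_cm <= 300:
        return "Come back!"
    return "Come play!"

class HoldTrigger:
    """Fires once the subject stays in the sweet spot for HOLD_SECONDS."""
    def __init__(self, clock=time.monotonic):
        self.clock = clock
        self.since = None   # when the subject entered the sweet spot

    def update(self, distance_cm):
        if SWEET_SPOT[0] <= distance_cm <= SWEET_SPOT[1]:
            if self.since is None:
                self.since = self.clock()
            return self.clock() - self.since >= HOLD_SECONDS
        self.since = None   # left the spot: reset the timer
        return False
```

In a real booth, `update()` would be called in a loop with fresh ultrasonic sensor readings, taking a picture whenever it returns True.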



Interaction

While the trigger could directly move to the capture step, there’s a great opportunity for interaction at this point to engage and delight the user.  Examples of interaction could include:


  • Engage: In the Distance + Time example above, the photobooth is playful, using an LCD screen with text based on distance and time to engage the user.

  • Inform: The LCD screen could also be used to tell the user how to interact with the photobooth.  In the above example, before taking pictures, the photobooth asked the user to make a funny face and then showed the user a random word such as “crazy” just before taking the picture.

  • Give and Take: Interaction could also consist of the user providing something, such as their Twitter username, in return for the photobooth taking a picture.  The photobooth could then use the Twitter username to tag the photo.



Capture

The photography step of the equation.  Examples of capture could include:

  • Timing: The time between pictures or a random time

  • Number of Pictures: The number of pictures taken or a random number

  • Shape of Pictures: Vertical, horizontal, square

  • Filters: Color, B&W, think Instagram

  • Lighting: Flash, No Flash, Ringflash

  • Post Processing: Keep pictures separate, merge photos into a 4 x 1 strip, merge photos into an animated GIF



Sharing

One of the greatest challenges with photography is doing something with all those pictures.  Sharing is the part of the equation that allows people to share and remember moments.  Examples of sharing could include:

  • Twitter: Posting to Twitter

  • Printing: Printing a photostrip

  • Emailing: Emailing to the user



However, before experimenting with variations, I had to create a working prototype.  There are lots of great tutorials and troubleshooting posts all over the Internet, so I won’t provide step by step instructions here.  However, if you have specific questions, feel free to contact me.

One of the first animated GIF examples posted to Tumblr by TinkurBooth.

How It Works

The following is an overview of the TinkurBooth Platform workflow:

  1. Run sudo python script

  2. Script monitors for motion

  3. When motion is detected, four pictures are taken using the raspistill command and saved locally; code based on

  4. The four photos are merged together using ImageMagick and saved locally as an animated GIF

  5. The animated GIF is uploaded to a Gmail account; code based on

  6. The If This Then That (IFTTT) service monitors the Gmail account and runs macros to post the animated GIF to Tumblr (and thanks to Tumblr for being so awesome as to actually support animated GIFs!)
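Steps 3 and 4 of the workflow above can be sketched in Python by shelling out to `raspistill` and ImageMagick's `convert`. File names, sizes, and delays below are illustrative, not the actual script's values:

```python
# Sketch of steps 3 and 4: shell out to raspistill for four stills, then
# merge them into a looping animated GIF with ImageMagick's convert tool.
# File names, sizes, and delays are illustrative.
import subprocess

def capture_commands(n=4, width=640, height=480, prefix="photo"):
    """Build one raspistill command per frame."""
    return [["raspistill", "-w", str(width), "-h", str(height),
             "-t", "1000", "-o", f"{prefix}{i}.jpg"]
            for i in range(1, n + 1)]

def gif_command(n=4, prefix="photo", out="booth.gif"):
    """Build the ImageMagick command that merges the frames into a GIF."""
    frames = [f"{prefix}{i}.jpg" for i in range(1, n + 1)]
    return ["convert", "-delay", "50", "-loop", "0", *frames, out]

def run_booth():
    """Capture and merge; only works on a Pi with the camera attached."""
    for cmd in capture_commands():
        subprocess.run(cmd, check=True)
    subprocess.run(gif_command(), check=True)
```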





  1. Follow the awesome Adafruit Raspberry Pi Tutorials

    1. Prepare SD Card and Install Raspbian OS

    2. Configure Pi

    3. Configure Network

    4. Configure VNC (so you can program from your desktop or laptop)

    5. Setup GPIO (General Purpose Input Output) Libraries

    6. Try Sensing Some Movement

  2. Setup and configure the Raspberry Pi Camera
  3. Download the TinkurBooth source code from GitHub at


Helpful Tutorials


What’s Next

Stay tuned for future posts about TinkurBooth.  For the 1st “TinkurBooth” of the month, I’m going to be creating a version that is activated by distance and uses an LCD screen to interact with the user, asking them to play a game and act out an event that will be turned into an animated GIF.

And if you’re wondering about that Polaroid from Jim’s dorm room, here it is – Jim, Adam, and Val.

Jim Adam and Val Polaroid



Raspberry Pi Camera Command List

Now that I’ve got my Raspberry Pi’s Camera Module working, I’m sure I’ll need to keep a list of the software commands handy.  The following is the list of commands supported by v1.1 of the RaspiStill Camera App:

Use from a command line.  Usage: raspistill [options]

Image Parameter Commands

-?, --help : This help information
-w, --width : Set image width <size>
-h, --height : Set image height <size>
-q, --quality : Set jpeg quality <0 to 100>
-r, --raw : Add raw bayer data to jpeg metadata
-o, --output : Output filename <filename> (to write to stdout, use '-o -'). If not specified, no file is saved
-v, --verbose : Output verbose information during run
-t, --timeout : Time (in ms) before takes picture and shuts down (if not specified, set to 5s)
-th, --thumb : Set thumbnail parameters (x:y:quality)
-d, --demo : Run a demo mode (cycle through range of camera options, no capture)
-e, --encoding : Encoding to use for output file (jpg, bmp, gif, png)
-x, --exif : EXIF tag to apply to captures (format as 'key=value')
-tl, --timelapse : Timelapse mode. Takes a picture every <t>ms

Preview Parameter Commands

-p, --preview : Preview window settings <'x,y,w,h'>
-f, --fullscreen : Fullscreen preview mode
-op, --opacity : Preview window opacity (0-255)
-n, --nopreview : Do not display a preview window

Image Parameter Commands

-sh, --sharpness : Set image sharpness (-100 to 100)
-co, --contrast : Set image contrast (-100 to 100)
-br, --brightness : Set image brightness (0 to 100)
-sa, --saturation : Set image saturation (-100 to 100)
-ISO, --ISO : Set capture ISO
-vs, --vstab : Turn on video stabilisation
-ev, --ev : Set EV compensation
-ex, --exposure : Set exposure mode (see Notes)
-awb, --awb : Set AWB mode (see Notes)
-ifx, --imxfx : Set image effect (see Notes)
-cfx, --colfx : Set colour effect (U:V)
-mm, --metering : Set metering mode (see Notes)
-rot, --rotation : Set image rotation (0-359)
-hf, --hflip : Set horizontal flip
-vf, --vflip : Set vertical flip
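A small Python helper can assemble a `raspistill` invocation from the options listed above. The flag names come from the table; the helper itself (and its parameter names) is hypothetical:

```python
# Sketch: assemble a raspistill invocation from the options listed above.
# Flag names come from the command table; the helper itself is hypothetical.
def raspistill_cmd(output, width=None, height=None, quality=None,
                   timeout_ms=None, encoding=None, extra=()):
    """Build an argv list suitable for subprocess.run()."""
    cmd = ["raspistill", "-o", output]
    if width is not None:
        cmd += ["-w", str(width)]
    if height is not None:
        cmd += ["-h", str(height)]
    if quality is not None:
        cmd += ["-q", str(quality)]
    if timeout_ms is not None:
        cmd += ["-t", str(timeout_ms)]
    if encoding is not None:
        cmd += ["-e", encoding]
    cmd += list(extra)   # e.g. ["-vf", "-hf"] to flip the image
    return cmd
```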

Exposure mode options : off, auto, night, nightpreview, backlight, spotlight, sports, snow, beach, verylong, fixedfps, antishake, fireworks

AWB mode options : off, auto, sun, cloud, shade, tungsten, fluorescent, incandescent, flash, horizon

Image Effect mode options : none, negative, solarise, sketch, denoise, emboss, oilpaint, hatch, gpen, pastel, watercolour, film, blur, saturation, colourswap, washedout, posterise, colourpoint, colourbalance, cartoon

Metering Mode options : average, spot, backlit, matrix


Sticky Notes – Awesome for Brainstorming and Collaboration

There’s a stereotype out there – that agile teams use a lot of sticky notes.  Well guess what – it’s true!  If you walk around the workspace of almost any agile team, you are likely to see countless colorful sticky notes on walls and windows everywhere.  But why?  In my experience, sticky notes are one of the most powerful and versatile tools for collaboration I’ve ever used – including software tools.  But I’ve also learned it can take a bit of convincing and learning by doing to help others understand the “Power of Post-Its“.  Through a series of blog posts, I hope to turn you into a believer too!

Sticky Notes

As an agile practitioner and coach, I am often facilitating collaborative sessions – to brainstorm, to prioritize, to turn a vision into a plan, to collect feedback, to estimate, to sync on progress.  And my number one tool of choice is sticky notes.  These seemingly simple pieces of paper are a powerful tool to get the whole team to participate and rapidly share their knowledge and perspectives while collaborating towards a shared goal.

While I’ll discuss how to use sticky notes for many different types of collaboration in future posts, let’s start by looking at the many benefits of sticky notes as a tool:


Quick

Capturing a thought on a sticky note is quick.  Just grab a pen or Sharpie marker – my favorite writing implement – and jot down your thought in a few words.  Great!  Now write down another.  And another.  Awesome – you’re rapidly brainstorming!

Compact and Concise

With limited time and attention it’s often important to distill the most important facts down to a few meaningful words or sentences – or a picture.  Sticky notes have a finite amount of space, requiring each note to contain a simple and focused idea.  Can’t fit it all on one sticky note?  Good – use a second and third, making sure each item is focused.  I bet you didn’t think less space was a good thing!

Divide and Conquer

Sticky notes allow a whole team of people to brainstorm independently at the same time.  Just ask each person to write down as many ideas as they can within a 5 to 10 minute time box.  Don’t worry about evaluating, discussing, or prioritizing sticky notes yet.  Brainstorming is all about unfiltered ideas anyway.  And by asking everyone to work in parallel without discussion, you’ll get lots of ideas written down.

The Sticky Factor

Sticky notes stick to just about anything – walls, windows, desks – especially the Super Sticky Post-It Notes which I highly recommend.  When I’m facilitating collaborative sessions, I often use painter’s or artist’s tape to create an appropriate set of buckets on the wall and then ask team members to add their sticky notes to each area.  For example, while facilitating a team retrospective to determine how to improve the team and their process, I’ll make buckets such as “do more of”, “do less of”, “what if…”.


Colorful and Visible

One of my favorite reasons for using sticky notes is that they’re colorful and visible.  By putting your sticky notes in a public space such as an office wall, you and your team are constantly reminded of the sticky notes and encourage others to check out what’s happening.  It’s a great way to radiate information and invite others to participate in collaboration.


Tangible

You can touch a sticky note.  It’s real.  When you’re discussing it, you can pick it up so others know which thing you’re talking about.  When you’ve made progress, you can move it from an in progress bucket to a done bucket.  When you’re done done with it you can tear it up and throw it away – my favorite!

Simple or Complex

Just like software tools, sticky notes can support just about any process and metadata you can invent.  Want to track new features vs. defects – use different colored sticky notes.  Want to track how many days something has been in progress – add a dot to the lower left corner of the sticky note each day.  Want to track who’s working on what – give each person a unique sticker to add to sticky notes.

Repositionable and Groupable

Combined with a wall or window and some painter’s tape to create some buckets, sticky notes make a great system to track just about anything.  And best of all – they’re built to be moved around over and over and over.  Try reconfiguring your software tools to match your process that quickly!

While I also use many other agile tools, sticky notes are one of my favorites.  Try it out yourself and see if you become a believer!  Or check out some other team’s sticky notes.

