Demo of my hackathon project to create a Jenkins Build Watcher for Hubot that watches an active build in Jenkins and provides updates when the build completes.
I haven’t shared many new projects recently, but that’s because I’ve been hacking at work – at CA Technologies (formerly Rally Software). Every 3 months, we take a week to hack on something that we feel is valuable to the company. The following are a few of my most recent hacks. Enjoy!
Demo of my hackathon project to create an Out of Office script for Hubot that lets people know when you’re out of the office or busy:
Source code available at https://github.com/adamzolyak/hubot-custom-outofoffice
Hubot script available at https://www.npmjs.com/package/hubot-custom-outofoffice
https://github.com/TinkurLab/TinkurWash/
Last week, I attended the O’Reilly Solid Conference, which focuses on solid – the intersection of hardware + software + physical things. While hardware and software have existed for decades and are mature in their own right, it is often the intersection of disparate things that creates innovation through collisions of ideas. And so over 1,400 curious people gathered together to collide.
I wasn’t sure what to expect of the conference. I am passionate about hardware + software and the new possibilities for interactions and solutions that it enables. And while the conference wasn’t exactly what I thought it would be, it surprised, delighted, and taught me, leaving me hopeful and inspired about the future of solid.
While I’ll share some focused takeaways and observations in the remainder of this post, perhaps the thing that made the biggest impression on me was the diversity of the attendees and speakers. From professors to mechanical engineers to startup founders to artists to hackers to industrial designers to the just interested, solid is well poised to collide ideas to enable new and unexpected outcomes. It will be interesting to see the collisions it creates in the future, as well as who composes the community in the coming years.
After attending a diverse set of sessions and chatting with many attendees, the following takeaways stand out for me:
-Adam
One of my favorite things about the Internet of Things is “giving a voice to things”. And who’s more deserving of a voice – and a Twitter account – than our roommate’s dog, Tyr? Tyr, an English Springer Spaniel, enjoys running, playing with his toys, and hanging out in his crate. In fact, he’s got a fancy crate located in a prime location in our living room. TinkurCrate is the first in a series of projects to learn more about Tyr’s activities and connect him to the IoT. TinkurCrate uses an Arduino with Ethernet and a proximity sensor to determine when Tyr is in his crate. The Arduino posts the data to Xively for storage, and Xively sends Web Hook triggers to Zapier when Tyr enters and exits his crate – triggering Tweets on TinkurLab’s IO Twitter account – @TinkurLabIO.
Wiring for the project is pretty simple: the proximity sensor’s output connects to analog pin A5 on the Arduino Ethernet shield (the pin the sketch below reads), along with power and ground.
/*

Published: 2014
Author: Adam - TINKURLAB
Web: www.TinkurLab.com

Copyright: This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License. http://creativecommons.org/licenses/by-nc/4.0/

Contributors:
-Xively code based on http://arduino.cc/en/Tutorial/XivelyClient
-Median library http://playground.arduino.cc/Main/RunningMedian

*/

#include <SPI.h>
#include <Ethernet.h>
#include "RunningMedian.h"

#define APIKEY "xxxxxxxxxxx" // replace with your Xively API Key
#define FEEDID xxxxxxxxxxx // replace with your Xively Feed ID
#define USERAGENT "TinkurCrate" // user agent is the project name

// assign a MAC address for the ethernet controller.
// Newer Ethernet shields have a MAC address printed on a sticker on the shield
// fill in your address here:
byte mac[] = { 0x00, 0x00, 0x00, 0x00, 0x00, 0x00 }; // replace with your MAC address

// fill in an available IP address on your network here,
// for manual configuration:
//IPAddress ip(10,0,1,20);

// initialize the library instance:
EthernetClient client;

char server[] = "api.xively.com"; // name address for the Xively API

unsigned long lastConnectionTime = 0; // last time you connected to the server, in milliseconds
boolean lastConnected = false; // state of the connection last time through the main loop
const unsigned long postingInterval = 10*1000; // delay between updates to Xively.com

int modeSwitch = 1;
int incrate = 0;

RunningMedian proximityLast10 = RunningMedian(10);

int timesincrate = 0;
int sensorReading = 0;
int sensorReadingMedian = 0;

void setup() {
  // Open serial communications and wait for port to open:
  Serial.begin(9600);

  delay(2000);

  // Connect to network and obtain an IP address using DHCP
  if (Ethernet.begin(mac) == 0)
  {
    Serial.println("DHCP Failed, reset Arduino to try again");
    Serial.println();
  }
  else
  {
    Serial.println("Arduino connected to network using DHCP");
    Serial.println();
  }
}

void loop() {

  if (modeSwitch == 1)
  {
    // read the analog proximity sensor:
    sensorReading = analogRead(A5);

    proximityLast10.add(sensorReading);
    sensorReadingMedian = proximityLast10.getMedian();

    Serial.println();
    Serial.print("Proximity: ");
    Serial.println(sensorReading);

    Serial.print("Median Proximity: ");
    Serial.print(sensorReadingMedian);
    Serial.print(" w/ ");
    Serial.print(proximityLast10.getCount());
    Serial.println(" samples");

    delay(1000);

    // above the upper threshold, Tyr is in the crate...
    if (sensorReadingMedian > 160)
    {
      incrate = 1;
    }

    // ...and below the lower threshold, he's out of it
    if (sensorReadingMedian < 100)
    {
      incrate = 0;
    }

    Serial.print("Is In Crate: ");
    Serial.println(incrate);
    Serial.println();
  }

  // convert the data to a String
  String dataString = "proximity,";
  dataString += String(sensorReadingMedian);

  // you can append multiple readings to this String to
  // send the Xively feed multiple values
  dataString += "\nincrate,";
  dataString += String(incrate);

  // if there's incoming data from the net connection,
  // send it out the serial port. This is for debugging
  // purposes only:
  if (client.available()) {
    char c = client.read();
    Serial.print(c);
  }

  // if there's no net connection, but there was one last time
  // through the loop, then stop the client:
  if (!client.connected() && lastConnected) {
    Serial.println();
    Serial.println("disconnecting.");

    resetMode();
    client.stop();
  }

  // if you're not connected, and ten seconds have passed since
  // your last connection, then connect again and send data:
  if (!client.connected() && (millis() - lastConnectionTime > postingInterval)) {
    modeSwitch = 2;
    sendData(dataString);
  }

  // store the state of the connection for next time through
  // the loop:
  lastConnected = client.connected();
}

// this method makes an HTTP connection to the server:
void sendData(String thisData) {

  // if there's a successful connection:
  if (client.connect(server, 80)) {
    Serial.println("connecting...");
    // send the HTTP PUT request:
    client.print("PUT /v2/feeds/");
    client.print(FEEDID);
    client.println(".csv HTTP/1.0");
    client.println("Host: api.xively.com");
    client.print("X-ApiKey: ");
    client.println(APIKEY);
    client.print("Content-Length: ");
    client.println(thisData.length());

    // last pieces of the HTTP PUT request:
    client.println("Content-Type: text/csv");
    client.println("Connection: close");
    client.println();

    // here's the actual content of the PUT request:
    client.println(thisData);
    Serial.println(thisData);

    client.println();
  }
  else {
    // if you couldn't make a connection:
    Serial.println("connection failed");
    Serial.println();
    Serial.println("disconnecting.");
    client.stop();
  }

  // note the time that the connection was made or attempted:
  lastConnectionTime = millis();
}

void resetMode()
{
  modeSwitch = 1;
  incrate = 0;
}

Source code also available on GitHub at https://github.com/TinkurLab/TinkurCrate/.
Xively Setup
Xively is a service for powering the Internet of Things, providing API access for data storage, data retrieval, triggers, and data charting. After registering for a free Xively account, perform the following steps:
1) Add Device.
2) Add Channels for “incrate” and “proximity”.
3) Set up Triggers for “incrate” status changes. One Trigger calls the Zapier “Entering Crate” Zap, and the other Trigger calls the Zapier “Exiting Crate” Zap.
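If you want to double-check that the channels are filling with data before relying on the Triggers, you can read the feed back over Xively’s REST API. The following is a rough Python sketch, not part of the project code: it assumes the same v2 feed endpoint and X-ApiKey header that the Arduino sketch above uses for its PUTs, and the feed ID and API key are placeholders.

# Rough sanity check of the Xively feed (assumes the /v2/feeds/<id>.csv endpoint
# and X-ApiKey header used by the Arduino sketch above).
import requests

FEED_ID = "xxxxxxxxxxx"   # replace with your Xively Feed ID
API_KEY = "xxxxxxxxxxx"   # replace with your Xively API Key

response = requests.get(
    "https://api.xively.com/v2/feeds/%s.csv" % FEED_ID,
    headers={"X-ApiKey": API_KEY},
)
response.raise_for_status()

# Each CSV line should show a channel ("proximity" or "incrate") and its latest value
print(response.text)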
Zapier Setup
Zapier is a service for orchestrating and automating the Internet of Things and popular online services. After registering for a free Zapier account, perform the following steps:
1) Create a new Zap, using a Web Hook – Catch Hook event, triggering a Twitter – Crate Tweet event.
2) Authorize your Twitter account if needed, and add a Filter to only trigger when the Xively trigger is “1” (when the dog enters the crate).
3) Set up a Message to post to Twitter and save the Zap.
You can view Tyr’s live status at https://xively.com/feeds/1765156749 and his Tweets at @TinkurLabIO.
What’s Next
While TinkurCrate is a good beginning, I hope to add a few more sensors to Tyr’s world. Possibilities include:
Have any other ideas or suggestions? Tweet @TinkurLab.
-Adam
I remember the first time I visited my friend Jim at college many years ago. Upon walking into his small two-person dorm room, the first thing I noticed was all the Polaroid photos hanging around his door frame – of smiling friends and a few funny faces. Jim had a tradition of taking a Polaroid picture when someone new visited him – a way to remember all the great people that one encounters at college. I’ve always had a love of candid photography – photos that capture the essence of a moment or a person. Photobooths and Polaroids are great tools for capturing candid pictures – they’re easy, quick, and produce a tangible result.
While I still love the Four Frames Photobooth, it takes a bit of time to transport, set up, and take down. Although Four Frames Photobooth has had a busy life – attending multiple weddings, a few business parties, and even a happy hour – I wanted something that would be, well, easier, quicker, and a ‘lil bit more playful. Meet TinkurBooth – a platform for taking quick and candid photos with endless possibilities for innovation.
TinkurBooth was my first project using a Raspberry Pi. Many of TinkurLab’s creations have already used microcontrollers such as the Arduino. However, I’ve been looking for an excuse to try a Raspberry Pi. Not only are Pis cheap ($30-40), but the Pi Camera module is also cheap ($30), and you can use a cheap ($10) wi-fi adapter for connectivity – less than $100 for a photobooth is a pretty good deal. And honestly, there’s just something geekily awesome about taking pictures using shell commands and switches!
“And honestly, there’s just something geekily awesome about taking pictures using shell commands and switches!”
I wanted TinkurBooth to be more than a “once and done” photobooth. I wanted it to be a platform for experimentation and play – for trying new ideas. Coming from my day job as an agile leader and coach, I am often reminded about the countless opportunities for experimentation – to try new things, test a hypothesis, and validate the outcome. The TinkurBooth experience is built around four steps, which allow for many possibilities: Trigger + Interaction + Capture + Sharing.
“Aren’t you going a bit too far? It’s just a photobooth!” is what you’re probably thinking right now. Let me explain. Thinking of each of these steps as separate but related helps me think about the possibilities of changing one or more of the steps to create a different experience that engages different people in different ways. For example, one day the TinkurBooth could use a motion trigger while another day it could use a sound trigger that listens for a clap. As another example, TinkurBooth could capture a single black and white photo one day while another day it could take four photos and merge them into an animated GIF. In fact, my goal for the next year is to have a new version of TinkurBooth every month or two to see how each variation changes the user experience.
This step is about the system knowing the user wants to interact with it (or convincing the user to interact with it). Examples of triggers could include:
Button: Pressing a button
Motion: Passing within a certain area (width and depth)
Sound: A loud sound or a certain type of sound (ex. a clap or whistle)
Light: A dramatic change in light, indicating something has changed
Distance and Movement: Coming within a certain distance
Time: A certain amount of elapsed time
Proximity: Being within a certain distance, although not necessarily within visual proximity (ex. think about using geo location)
Something You Have: Needing a smartphone to interact with it
Pattern: Pressing a button in a certain pattern
…
And let’s not forget about the possibility of mashups such as Distance + Time. For example, an early prototype of TinkurBooth required the user to be standing within a 2-foot area before it would take a picture. Too close and it would yell at you to stand back. Too far away and it would start to worry and ask you to come back. Really far away and it would ask you to come play. And if you stood in just the right spot for two seconds, it would take your picture.
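For the curious, here is roughly what that Distance + Time logic looks like in code. This is a simplified Python sketch rather than the actual prototype’s code: read_distance_inches() stands in for whatever distance sensor you have wired up, show_message() and take_picture() stand in for your LCD and camera code, and the thresholds and messages are just illustrative.

# Simplified sketch of a Distance + Time trigger (not the actual TinkurBooth code).
# read_distance_inches(), show_message(), and take_picture() are placeholders.
import time

SWEET_SPOT = (24, 48)   # inches: roughly the 2-foot band where a picture gets taken
HOLD_TIME = 2.0         # seconds the subject must stay in the sweet spot

def read_distance_inches():
    raise NotImplementedError("wrap your distance sensor here")

def wait_and_snap(show_message, take_picture):
    held_since = None
    while True:
        distance = read_distance_inches()
        if distance < SWEET_SPOT[0]:
            show_message("Whoa, stand back!")        # too close: yell a bit
            held_since = None
        elif distance <= SWEET_SPOT[1]:
            held_since = held_since or time.time()   # in the sweet spot: start the clock
            if time.time() - held_since >= HOLD_TIME:
                take_picture()
                return
            show_message("Hold it right there...")
        elif distance <= 120:
            show_message("Come back!")               # wandering off: start to worry
            held_since = None
        else:
            show_message("Come play with me!")       # really far away: invite them over
            held_since = None
        time.sleep(0.1)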
While the trigger could lead directly to the capture step, there’s a great opportunity for interaction at this point to engage and delight the user. Examples of interaction could include:
Engage: In the Distance + Time example above, the photobooth is playful, using an LCD screen with text based on distance and time to engage the user.
Inform: The LCD screen could also be used to tell the user how to interact with the photobooth. In the above example, before starting to take pictures, the photobooth asked the user to make a funny face and then showed the user a random word such as “crazy” just before taking the picture.
Give and Take: Interaction could also consist of the user providing something, such as their Twitter username, in return for the photobooth taking a picture. The photobooth could then use the Twitter username to tag the photo.
…
Capture is the photography step of the equation. Examples of capture could include:
Timing: The time between pictures or a random time
Number of Pictures: The number of pictures taken or a random number
Shape of Pictures: Vertical, horizontal, square
Filters: Color, B&W, think Instagram
Lighting: Flash, No Flash, Ringflash
Post Processing: Keep pictures separate, merge photos into a 4 x 1 strip, or merge photos into an animated GIF (see the sketch after this list)
…
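To make the post-processing option concrete, here is a small sketch that drives ImageMagick’s convert tool from Python, which is the same tool TinkurBooth uses for its GIFs. ImageMagick needs to be installed, and the filenames and delay are just examples.

# Two post-processing options from the list above, using ImageMagick's convert command.
import subprocess

photos = ["photo1.jpg", "photo2.jpg", "photo3.jpg", "photo4.jpg"]  # example filenames

# Option 1: stack the four photos into a classic 4 x 1 photostrip (-append stacks vertically)
subprocess.check_call(["convert"] + photos + ["-append", "photostrip.jpg"])

# Option 2: loop the four photos as an animated GIF (delay is in 1/100ths of a second)
subprocess.check_call(["convert", "-delay", "50", "-loop", "0"] + photos + ["booth.gif"])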
One of the greatest challenges with photography is doing something with all those pictures. Sharing is the part of the equation that allows people to share and remember moments. Examples of sharing could include:
Twitter: Posting to Twitter
Printing: Printing a photostrip
Emailing: Emailing to the user
…
However, before experimenting with variations, I had to create a working prototype. There are lots of great tutorials and troubleshooting posts all over the Internet, so I won’t provide step by step instructions here. However, if you have specific questions, feel free to contact me.
One of the first animated GIF examples posted to Tumblr http://tinkurbooth.tumblr.com/ by TinkurBooth.
The following is an overview of the TinkurBooth Platform workflow (a rough code sketch of the flow follows the list):
Run the boothsnap.py script with sudo python boothsnap.py
Script monitors for motion
When motion is detected, four pictures are taken using the raspistill command and saved locally; code based on https://gist.github.com/benhosmer/5653641
The four photos are merged together using ImageMagick and saved locally as an animated GIF
The animated GIF is uploaded to a Gmail account; code based on http://mitchtech.net/connect-raspberry-pi-to-gmail-facebook-twitter-more/
The If This Then That (IFTTT) service monitors the Gmail account and runs macros to post the animated GIF to Tumblr (and thanks to Tumblr for being so awesome as to actually support animated GIFs!)
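Here is that rough sketch of the flow in Python. It is not the real boothsnap.py: the GPIO pin, picture settings, and Gmail credentials are placeholders you would swap for your own, but it shows how the pieces hang together.

# Condensed sketch of the TinkurBooth workflow (not the real boothsnap.py).
# Assumptions: the PIR sensor's output is on GPIO 18, and GMAIL_USER / GMAIL_PASSWORD
# are placeholders for the Gmail account that IFTTT watches.
import smtplib
import subprocess
import time
from email.mime.image import MIMEImage
from email.mime.multipart import MIMEMultipart

import RPi.GPIO as GPIO

PIR_PIN = 18                      # change to match how your Cobbler is wired
GMAIL_USER = "you@gmail.com"      # placeholder
GMAIL_PASSWORD = "xxxxxxxx"       # placeholder

def capture_and_share():
    # 1. Take four pictures with raspistill and save them locally
    photos = []
    for i in range(4):
        filename = "booth_%d.jpg" % i
        subprocess.check_call(["raspistill", "-t", "1000", "-w", "800", "-h", "600",
                               "-o", filename])
        photos.append(filename)

    # 2. Merge the stills into an animated GIF with ImageMagick
    subprocess.check_call(["convert", "-delay", "50", "-loop", "0"] + photos + ["booth.gif"])

    # 3. Email the GIF to the Gmail account; IFTTT takes it from there to Tumblr
    msg = MIMEMultipart()
    msg["Subject"] = "TinkurBooth"
    msg["From"] = GMAIL_USER
    msg["To"] = GMAIL_USER
    with open("booth.gif", "rb") as f:
        msg.attach(MIMEImage(f.read(), _subtype="gif"))
    server = smtplib.SMTP("smtp.gmail.com", 587)
    server.starttls()
    server.login(GMAIL_USER, GMAIL_PASSWORD)
    server.sendmail(GMAIL_USER, [GMAIL_USER], msg.as_string())
    server.quit()

# Watch the PIR motion sensor and capture when something moves
GPIO.setmode(GPIO.BCM)
GPIO.setup(PIR_PIN, GPIO.IN)
try:
    while True:
        if GPIO.input(PIR_PIN):
            capture_and_share()
            time.sleep(10)        # cool down so one visit does not flood the Tumblr feed
        time.sleep(0.2)
finally:
    GPIO.cleanup()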
Raspberry Pi Model B
http://www.adafruit.com/products/998
Raspberry Pi Camera Board
http://www.adafruit.com/products/1367
Raspberry Pi Cobbler
http://www.adafruit.com/products/1105
PIR Motion Sensor
http://www.adafruit.com/products/189
32GB SD Card
Latest Raspbian OS Image Download
http://www.raspbian.org/
Follow the awesome Adafruit Raspberry Pi Tutorials
Prepare SD Card and Install Raspbian OS http://learn.adafruit.com/adafruit-raspberry-pi-lesson-1-preparing-and-sd-card-for-your-raspberry-pi
Configure Pi http://learn.adafruit.com/adafruits-raspberry-pi-lesson-2-first-time-configuration
Configure Network http://learn.adafruit.com/adafruits-raspberry-pi-lesson-3-network-setup
Configure VNC (so you can program from your desktop or laptop) http://learn.adafruit.com/adafruit-raspberry-pi-lesson-7-remote-control-with-vnc
Setup GPIO (General Purpose Input Output) Libraries http://learn.adafruit.com/adafruits-raspberry-pi-lesson-4-gpio-setup
Try Sensing Some Movement http://learn.adafruit.com/adafruits-raspberry-pi-lesson-12-sensing-movement
Raspberry Pi Setup Checklist
http://raspberrypi.werquin.com/post/43645313109/initial-setup-of-the-raspberry-pi-out-of-the-box
Raspberry Pi Camera Setup and Configuration
http://www.raspberrypi.org/camera
Code: Connecting Raspberry Pi to Gmail and IFTTT
http://mitchtech.net/connect-raspberry-pi-to-gmail-facebook-twitter-more/
Code: Raspberry Pi Timed Capture
https://gist.github.com/benhosmer/5653641
ImageMagick Documentation
http://www.imagemagick.org/Usage/
Stay tuned for future posts about TinkurBooth. For the 1st “TinkurBooth” of the month, I’m going to be creating a version that is activated by distance and uses an LCD screen to interact with the user, asking them to play a game and act out an event that will be turned into an animated GIF.
And if you’re wondering about that Polaroid from Jim’s dorm room, here it is – Jim, Adam, and Val.
-Adam
Now that I’ve got my Raspberry Pi’s Camera Module working, I’m sure I’ll need to keep a list of the software commands handy. The following is the list of commands supported by v1.1 of the RaspiStill Camera App:
Use from a command line. Usage: raspistill [options]
Image Parameter Commands
-?, --help : This help information
-w, --width : Set image width <size>
-h, --height : Set image height <size>
-q, --quality : Set jpeg quality <0 to 100>
-r, --raw : Add raw bayer data to jpeg metadata
-o, --output : Output filename <filename> (to write to stdout, use '-o -'). If not specified, no file is saved
-v, --verbose : Output verbose information during run
-t, --timeout : Time (in ms) before takes picture and shuts down (if not specified, set to 5s)
-th, --thumb : Set thumbnail parameters (x:y:quality)
-d, --demo : Run a demo mode (cycle through range of camera options, no capture)
-e, --encoding : Encoding to use for output file (jpg, bmp, gif, png)
-x, --exif : EXIF tag to apply to captures (format as 'key=value')
-tl, --timelapse : Timelapse mode. Takes a picture every <t>ms
Preview Parameter Commands
-p, --preview : Preview window settings <'x,y,w,h'>
-f, --fullscreen : Fullscreen preview mode
-op, --opacity : Preview window opacity (0-255)
-n, --nopreview : Do not display a preview window
Image Parameter Commands
-sh, --sharpness : Set image sharpness (-100 to 100)
-co, --contrast : Set image contrast (-100 to 100)
-br, --brightness : Set image brightness (0 to 100)
-sa, --saturation : Set image saturation (-100 to 100)
-ISO, --ISO : Set capture ISO
-vs, --vstab : Turn on video stablisation
-ev, --ev : Set EV compensation
-ex, --exposure : Set exposure mode (see Notes)
-awb, --awb : Set AWB mode (see Notes)
-ifx, --imxfx : Set image effect (see Notes)
-cfx, --colfx : Set colour effect (U:V)
-mm, --metering : Set metering mode (see Notes)
-rot, --rotation : Set image rotation (0-359)
-hf, --hflip : Set horizontal flip
-vf, --vflip : Set vertical flip
Exposure mode options :
off,auto,night,nightpreview,backlight,spotlight,sports,snow,beach,verylong,fixedfps,antishake,fireworks
AWB mode options :
off,auto,sun,cloud,shade,tungsten,fluorescent,incandescent,flash,horizon
Image Effect mode options :
none,negative,solarise,sketch,denoise,emboss,oilpaint,hatch,gpen,pastel,watercolour,film,blur,saturation,colourswap,washedout,posterise,colourpoint,colourbalance,cartoon
Metering Mode options :
average,spot,backlit,matrix
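To tie a few of those options together, here is a minimal example of driving raspistill from Python, which is presumably how a script like boothsnap.py invokes it; the filenames and values are just examples.

# Minimal examples of calling raspistill with a few of the options listed above.
import subprocess

# Single 1024x768 JPEG at 90% quality, captured after a 2 second warm-up
subprocess.check_call(["raspistill", "-w", "1024", "-h", "768", "-q", "90",
                       "-t", "2000", "-o", "single.jpg"])

# Timelapse: one frame every 5 seconds for 60 seconds (%04d becomes the frame number)
subprocess.check_call(["raspistill", "-t", "60000", "-tl", "5000",
                       "-o", "frame_%04d.jpg"])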
There’s a stereotype out there – that agile teams use a lot of sticky notes. Well guess what – it’s true! If you walk around the workspace of almost any agile team, you are likely to see countless colorful sticky notes on walls and windows everywhere. But why? In my experience, sticky notes are one of the most powerful and versatile tools for collaboration I’ve ever used – including software tools. But I’ve also learned it can take a bit of convincing and learning by doing to help others understand the “Power of Post-Its”. Through a series of blog posts, I hope to turn you into a believer too!
As an agile practitioner and coach, I am often facilitating collaborative sessions – to brainstorm, to prioritize, to turn a vision into a plan, to collect feedback, to estimate, to sync on progress. And my number one tool of choice is sticky notes. These seemingly simple pieces of paper are a powerful tool to get the whole team to participate and rapidly share their knowledge and perspectives while collaborating towards a shared goal.
While I’ll discuss how to use sticky notes for many different types of collaboration in future posts, let’s start by looking at the many benefits of sticky notes as a tool:
Quick
Capturing a thought on a sticky note is quick. Just grab a pen or Sharpie marker – my favorite writing implement – and jot down your thought in a few words. Great! Now write down another. And another. Awesome – you’re rapidly brainstorming!
Compact and Concise
With limited time and attention, it’s often important to distill the most important facts down to a few meaningful words or sentences – or a picture. Sticky notes have a finite amount of space, requiring each note to contain a simple and focused idea. Can’t fit it all on one sticky note? Good – use a second and third, making sure each item is focused. I bet you didn’t think less space was a good thing!
Divide and Conquer
Sticky notes allow a whole team of people to brainstorm independently at the same time. Just ask each person to write down as many ideas as they can within a 5 to 10 minute time box. Don’t worry about evaluating, discussing, or prioritizing sticky notes yet. Brainstorming is all about unfiltered ideas anyway. And by asking everyone to work in parallel without discussion, you’ll get lots of ideas written down.
The Sticky Factor
Sticky notes stick to just about anything – walls, windows, desks – especially the Super Sticky Post-It Notes, which I highly recommend. When I’m facilitating collaborative sessions, I often use painter’s or artist’s tape to create an appropriate set of buckets on the wall and then ask team members to add their sticky notes to each area. For example, while facilitating a team retrospective to determine how to improve the team and their process, I’ll make buckets such as “do more of”, “do less of”, and “what if…”.
Visible
One of my favorite reasons for using sticky notes is that they’re colorful and visible. By putting your sticky notes in a public space such as an office wall, you and your team are constantly reminded of them, and you encourage others to check out what’s happening. It’s a great way to radiate information and invite others to participate in collaboration.
Touchable
You can touch a sticky note. It’s real. When you’re discussing it, you can pick it up so others know which thing you’re talking about. When you’ve made progress, you can move it from an in progress bucket to a done bucket. When you’re done done with it you can tear it up and throw it away – my favorite!
Simple or Complex
Just like software tools, sticky notes can support just about any process and metadata you can invent. Want to track new features vs. defects? Use different colored sticky notes. Want to track how many days something has been in progress? Add a dot to the lower left corner of the sticky note each day. Want to track who’s working on what? Give each person a unique sticker to add to sticky notes.
Repositionable and Groupable
Combined with a wall or window and some painter’s tape to create some buckets, sticky notes make a great system to track just about anything. And best of all – they’re built to be moved around over and over and over. Try reconfiguring your software tools to match your process that quickly!
While I also use many other agile tools, sticky notes are one of my favorites. Try it out yourself and see if you become a believer! Or check out some other team’s sticky notes.
-Adam
Tales of Tools and a Panning Timelapse GoPro Timer.
So what is the best tool in TinkurLab’s workshop? A Dremel! Hands down! Well, 1 minute quick drying epoxy is a close second, but let’s save that for another day.
It can cut. It can grind. It can drill. It can route. It can sand. It can polish. It’s pretty small. You can get new bits cheap.
So how did we get to this epiphany? I recently completed a small project to update my timelapse panning timer – affectionately named TinkurLapse. TinkurLapse rotates my GoPro camera as it takes a photo every few seconds, creating a panning timelapse video. My original timer was basically a $2 Ikea kitchen timer with a ¼”x20 screw glued to it to screw into the GoPro’s tripod mount. BTW, a standard camera tripod mount takes a – you guessed it – ¼”x20 screw. Remember that for your next trivia night. Anyway, in the spirit of most TinkurLab projects, the goal was to get a minimally viable product out the door quickly to start really learning about it and to make improvements. “Learning by doing” in other words.
And learn I did. While the first version of TinkurLapse worked, it had a few issues. First and foremost, having a tiny base, relatively high height, and very light weight resulted in an unbalanced base for the GoPro, which easily fell over. Many early experiments resulted in setting the camera up for a 60-minute timelapse only to return after an hour to find the camera lying sideways, rotating the timer under its base. Pretty uneventful video! The other issue was its size. While it wasn’t huge, it wasn’t small either. Given that most of my GoPro shooting occurs during travels and adventures like hiking, skiing, etc, it would be ideal to make the panning timer as small as possible. While the first version of the panning timer worked, I didn’t use it very much. It just wasn’t good enough for me. The value it provided didn’t overcome the cost of using it. My goal of creating awesome panning timelapse videos was blocked.
So one day while aimlessly wandering the Internet, I decided to search for a prebuilt device. I know – it was a moment of weakness. How un-DIY of me. There are not very many panning timelapse devices on the market (at least not under a few hundred dollars for professional use), but while browsing through the catalog of Photojojo, I came across the Camalapse. However, aside from the $30 price tag, it has the same problem as the first version of TinkurLapse – not stable enough. Camalapse is only 2 ounces, with a small base. A featherweight. However, it does support mounting to a tripod to provide a more stable base, so it has potential.
After knocking some sense into myself, I set out to research ways to improve the initial design. I found a video from a serious panning timelapser, who posted a great tutorial. The tutorial suggested using the same Ikea kitchen timer I used in the first version of TinkurLapse – no problem, I bought 3x just for this reason. The tutorial also had some other great ideas – removing the done bell so as not to scare all people and wildlife in a 1-mile radius, and mounting the timer to a Gorillapod tripod to provide a more stable base. Perfect!
Enter the Dremel. After some drilling, grinding, and gluing I had completed version two of TinkurLapse. On to testing. The initial test run resulted in rotation of about 90 degrees before stopping. After some investigation, I determined that excess glue and some of the timer’s parts were obstructing the rotation of the GoPro mount. What to do? Dremel it! After some precise grinding and sanding, I had trimmed down the responsible obstructions. The next test passed without issue – and without the annoying done bell at the end.
TinkurLapse v2 is ready for some more real world testing! I’m looking forward to a few upcoming skiing trips to use the new TinkurLapse. I’ll post some videos. Until then, happy tinkering!
Example Timelapse Video 1
Example Timelapse Video 2
-Adam