It was very inspiring to hear from both industry innovators and leading educators in the field. I’ve been trying to wrap my head around how best to support teachers in integrating virtual reality, augmented reality, and mixed reality in their classrooms, and I appreciated the perspectives that both industry and education folks shared. It was great to get both sides of the table in one place to begin the conversation!
Check out my Google Photos album here for some key moments as well as some presentation slides.
“This is a bundle of patches intended to help get you off the ground in using mobile phones to control Max.
As I like to have several phones connecting to Max in a network setup, I make use of udpreceive and udpsend so that ports can be dedicated to specific devices. I don’t know how to make use of Bonjour, so feedback is welcome!
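The port-per-device idea above can be sketched outside of Max as well. Here is a minimal Python sketch of it, assuming two hypothetical phones (“phone_a”, “phone_b”) and made-up port numbers; a real setup would point each phone’s TouchOSC output at its own port:

```python
import socket

# Hypothetical assignments: one dedicated UDP port per phone,
# mirroring one udpreceive object per device in the Max patches.
DEVICE_PORTS = {"phone_a": 9000, "phone_b": 9001}

def open_listeners(ports):
    """Bind one UDP socket per device so each phone gets its own port."""
    sockets = {}
    for name, port in ports.items():
        s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        s.bind(("0.0.0.0", port))
        sockets[name] = s
    return sockets

listeners = open_listeners(DEVICE_PORTS)
# Each socket can now recvfrom() the OSC packets its phone sends, e.g.:
# data, addr = listeners["phone_a"].recvfrom(1024)
```

Because each device owns a port, you never have to disambiguate which phone a packet came from; that is the same trade-off the dedicated udpreceive objects make.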
For Android users, I have created a patch that interfaces seamlessly with the TouchOSC ‘Simple’ layout, so even if you cannot add custom layouts, you can still make use of some of these patches.
Currently the best setup is either an iPhone, iPod Touch, or iPad on TouchOSC, but hopefully this will change as Android becomes more robust.”
On their About page, their vision statement reads: “Pachube (“patch-bay”) connects people to devices, applications, and the Internet of Things. As a web-based service built to manage the world’s real-time data, Pachube gives people the power to share, collaborate, and make use of information generated from the world around them.”
In 2008, Haque released a framework that set up a way to “handle real-time data from sensors in interactive environments for his design practice.” This is another piece of the crowdsourcing movement, and the ability to track this much data and run it through the Pachube hub is very awesome. In July 2011, LogMeIn acquired Pachube.
Anyway, Pachube used to be a paid service, but they recently (this month) made the service completely free for all users. An Overview of the Pachube API Documentation gives you a sense of how things work. There is also an API Quickstart that is useful if you know curl or hurl, command-line tools for fetching and posting data.
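If curl isn’t your thing, the same kind of request can be sketched in Python with only the standard library. This is a sketch, not a tested client: the v2 feed URL shape and the X-PachubeApiKey header are how I understand the docs, the feed id is arbitrary, and the key is a placeholder you would get from your Pachube account:

```python
import urllib.request

API_BASE = "http://api.pachube.com/v2"  # v2 endpoint, per the API docs
API_KEY = "YOUR_API_KEY"  # placeholder; use your own Pachube key

def feed_request(feed_id, fmt="json"):
    """Build (but don't send) a GET request for a feed's current data."""
    req = urllib.request.Request(f"{API_BASE}/feeds/{feed_id}.{fmt}")
    # Pachube authenticates requests via an API-key header.
    req.add_header("X-PachubeApiKey", API_KEY)
    return req

req = feed_request(504)  # 504 is an arbitrary example feed id
# urllib.request.urlopen(req).read() would fetch the JSON; omitted here.
```

The same pattern with a PUT method and a JSON body is how you would push your own datastream values up to a feed.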
So last week, I started wearing the FitBit, a biofeedback sensor that you can clip onto clothing and wear 24 hours a day to get real-time data about your activity level: a pedometer (steps taken), distance traveled, calories burned, and sleep tracking. And all of this from a single accelerometer (motion sensor): a few key algorithms derive the information from the x, y, z values of the motion sensor and display it on the associated website. (FitBit setup instructions are posted toward the bottom of this post.)
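To give a feel for how steps can fall out of raw x, y, z values, here is a toy sketch of the idea: take the magnitude of each acceleration sample and count upward crossings of a threshold. The threshold and the sample data are made up, and a real tracker like the FitBit certainly does far more careful filtering than this:

```python
import math

def count_steps(samples, threshold=1.3):
    """Toy step detector: count upward crossings of an
    acceleration-magnitude threshold (in g). Each spike above the
    threshold counts once until the signal drops back below it."""
    steps = 0
    above = False
    for x, y, z in samples:
        mag = math.sqrt(x * x + y * y + z * z)
        if mag > threshold and not above:
            steps += 1
            above = True
        elif mag <= threshold:
            above = False
    return steps

# Simulated data: resting near 1 g with two spikes (two "steps").
data = [(0, 0, 1.0), (0.2, 0.1, 1.5), (0, 0, 1.0), (0.3, 0, 1.6), (0, 0, 1.0)]
steps = count_steps(data)  # → 2
```

Distance and calories would then be further derived values (step count times estimated stride length, scaled by a profile-based burn rate), which is why the site asks for your height and weight.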
The FitBit website seems to be an incarnation of the HealthGraph API, which was made open-source by the folks behind the RunKeeper cross-platform app phenomenon. To understand this ‘movement’ (and I use the term literally) better, I registered for a RunKeeper account to see what this all involved. You log in, set up your profile user page, and if you link it to your Facebook account it creates a “Street Team” from your friends list, with whom you seem to compete or at least share acts of physical prowess that are logged on the site. The RunKeeper app is available on iPhone, Android, Windows Phone 7, and Nokia. The power of this platform seems to be, once again, the Health Graph API:
On this site, they speak of the “social graph”, which has evolved into the “Open Graph” described as “a system of connections that includes not just personal relationships, but also your personal ‘likes’ and interests. Any website, individual or group that you ‘like’ is eligible for inclusion in your open graph.” They say that this Open Graph is missing a few things that the ‘Health Graph’ can fill in:
Changes to your body measurements over time – a ‘like’ relationship doesn’t include a time factor.
The impact of your activities, sleep patterns and nutrition on your health and body measurements.
An analysis of the elements of your social graph that will help motivate you to reach your fitness goals.
Further, this app seeks to:
Identify periods of weight loss in your fitness history.
Establish correlations between weight loss & nutrition, sleep, activity performance, social motivation, etc.
Visually display the relationship between these health data points.
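The “establish correlations” item above is, at its core, just correlating two time series of health data points. As a minimal sketch, assuming entirely made-up sample numbers (weekly weight in kg versus average nightly sleep in hours), a Pearson correlation coefficient captures the kind of relationship the Health Graph is after:

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series:
    covariance divided by the product of the standard deviations."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Made-up sample: weekly weight (kg) vs. average nightly sleep (hours).
weight = [82.0, 81.5, 81.0, 80.2, 79.8]
sleep = [6.0, 6.5, 7.0, 7.5, 8.0]
r = pearson(weight, sleep)  # strongly negative: more sleep, lower weight here
```

With millions of users’ data, this is exactly the kind of computation that could surface the weight-loss/sleep/nutrition correlations the API promises, and the raw coefficients would also be natural inputs for a data visualization.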
RunKeeper claims to have over 6 MILLION USERS, so this seems like it could be a great example of philanthropic crowdsourcing. With so much data available for analysis, there will inevitably be some interesting correlations and discoveries, and the ability to refine the program. Having access to all of this data would indeed be very interesting for visualization and a “crowdsourced” interactive art piece.
So, the Runkeeper HealthGraph API is being used in the following devices:
Here is a graph of some of the Data Points available to correlate:
I think the best way to complete a “Demo” for others to use this data will actually be to develop a web-based app myself, reverse engineered from the Simple Client Example. I may be able to feed this data into Pachube and access it in Max/MSP that way, using the MaxPatch that is available from the Pachube forums.
Another way that I am exploring is to use QDot’s (NonPolynomial Labs) LibFitBit. I’m starting with this page to make sense of it all… do I need to learn Python for this?
This combo has a simple piezo transducer hooked up to the Arduino, and is “advertised” as an Earthquake Detector, but I wonder if you could use it to detect motion on a floor or surface when feet are stepping… Files available HERE
(From HERE): You can send multiple values from the Arduino board to the computer using the serial print function. In this project, the analog values from three potentiometers are used to set the red, green, and blue values of the background color of a Processing sketch running on the computer.
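On the computer side, the same parsing could be done in Python instead of Processing. This is a sketch under the assumption that the Arduino prints the three 10-bit pot values as one comma-separated line per loop (e.g. "1023,512,0"); `parse_rgb` is my own hypothetical helper name:

```python
def parse_rgb(line):
    """Parse one serial line like '1023,512,0' (three 10-bit analog
    values) into 0-255 RGB components, as the Processing sketch does."""
    parts = line.strip().split(",")
    if len(parts) != 3:
        return None  # incomplete or garbled serial line
    return tuple(int(p) * 255 // 1023 for p in parts)

# In the real setup these lines would arrive via pyserial's readline();
# here we just parse a sample string:
color = parse_rgb("1023,512,0")
```

Scaling by 255/1023 maps the analog range onto the 0-255 color range; Processing’s `map()` call does the same thing, and dropping malformed lines guards against the partial reads you get when you open the serial port mid-message.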