Stockpile Stewardship

National Atomic Testing Museum,
Stockpile Stewardship video players

The video players were built as desktop applications for the Surface Pro 8 tablet. Each plays videos in fullscreen.

This project was built using open-source JavaScript frameworks, including Vue and Electron.

Segesser Hides

NM History Museum,
Segesser Hides Interactive

The Segesser Hides are paintings on tanned bison hides that depict early Spanish colonial life in New Mexico. The interactive allows users to zoom in for a detailed look at the hides and learn more about the unique sections of the paintings.

This project was built with HTML5, CSS, JavaScript, and open-source JavaScript libraries. It uses assets from the original Adobe Flash version of the Segesser Hides interactive.

Adobe Flash is outdated, proprietary, and not supported on modern browsers and devices. By converting from Flash to standard web technologies, the new interactive is now available both on a touchscreen in the museum’s gallery and on the web.

Acadia Nature Center Tour

Sieur de Monts
Acadia Nature Center tour app
developed with Swift

This mobile tour of Acadia National Park’s Nature Center is a tool for visitors who are blind or have low vision, but it is useful to anyone who wants an in-depth experience of the new exhibits. The narrative, linked to locations in the building, describes how the park may change as the climate changes. The tour includes suggestions for moving between exhibits, starting at the service desk and then proceeding sequentially through the nine exhibits. There are exhibit descriptions and, in some places, guides to deeper exploration. You can listen to audio descriptions performed by actors or, by turning on Apple’s VoiceOver, listen to identical text descriptions.

Download from the iOS App Store

Oldfarm Mobile App

Oldfarm mobile app
Acadia National Park
developed with Swift

A guided tour of the remains of the estate of George B. Dorr, founder of Acadia National Park, in Bar Harbor, Maine. The app uses a map and overlays to locate cultural areas and provides captioned video interpretation for a variety of stops along a prescribed pathway. VoiceOver friendly.

Download from the iOS App Store

Beacons for Accessibility

I have been working on a beacon-based app for visitors with blind or low-vision accessibility needs as a pilot project for Acadia National Park. The overall goal is to allow visitors to follow a tour path of nine stops. To keep track of a visitor’s position, I am using Estimote Indoor Location beacons.

Screenshots:


SA Andree

Anderson-Abruzzo Albuquerque International Balloon Museum
S.A. Andree Exhibit
55-inch touch table interactives

I worked on the development of two 55-inch touch table interactives for an exhibit on S.A. Andree’s failed attempt to reach the North Pole by balloon in 1897.

Learn More…

Touch Table Interactive

I have been working on two 55-inch touch table interactives for the Anderson-Abruzzo Albuquerque International Balloon Museum and their upcoming exhibit on S.A. Andree’s failed attempt to reach the North Pole by balloon in 1897.

Early Prototype screenshots:

The first touch table will be embedded in an antique table and housed in a space designed as a Victorian study. The table will feature vintage diaries that contain content about S.A. Andree’s expedition plan and other polar expeditions of the time.

The second table will be embedded in an iceberg shape and housed in a space referred to as the Ice Room. The table will feature content based on the events of S.A. Andree’s flight and artifacts found at the crew’s final camp on White Island.

The framework for both interactives has been developed with Hammer.js, an open-source JavaScript gesture library, and NW.js, a tool that packages and distributes Node.js-based web apps as desktop applications.

demo code

Gear VR

Gear VR
Aside from the Gear VR headset, the Innovator Edition package includes a 16GB MicroSD card and adapter, a screen cleaning cloth, and a soft-shell travel case.



The Mobile SDK from Oculus can be found here:

Virtual Reality

Unlike augmented reality, virtual reality (VR) is a fully simulated environment based on fictitious or real-world locations, allowing users to have a virtual presence in those environments. VR requires specialized, costly hardware for its head-mounted display (HMD) and head tracking.

In 2012, Oculus released the Rift Development Kit 1 (DK1), one of the first VR headsets of its kind. The Oculus Rift HMD is powered by a computer via USB and includes sensors such as an accelerometer, magnetometer, and gyroscope. With an SDK and support for Mac, Windows, and Linux, the DK1 placed inexpensive VR in the hands of developers and gamers. Two years later, Oculus released the Development Kit 2 while other HMDs aimed to compete.

Google later launched Cardboard, a more accessible and inexpensive HMD. Cardboard allows any Android mobile device running the appropriate software (Jelly Bean and above) to simulate a VR experience similar to the Oculus Rift. However, most Android devices still have insufficient performance, and their displays are not equipped to handle low-latency VR.

When it comes to latency, I am referring to the display’s response time to head tracking. To make a VR experience as immersive as possible, every millisecond counts: 20 ms or less is the ideal latency, reducing the motion blur induced by head movement and enabling more realistic scenes with richer colors and contrast. Anything higher has noticeable lag. High latency can lead to a detached experience and contribute to a user’s motion sickness or dizziness.
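As a back-of-the-envelope sketch of why 20 ms is so tight (my own illustrative numbers, not vendor figures): the motion-to-photon budget has to cover sensor reading, rendering, and one display refresh, and at 60 Hz the refresh interval alone consumes most of it.

```python
# Back-of-the-envelope motion-to-photon latency check (illustrative numbers).
def motion_to_photon_ms(refresh_hz, sensor_ms, render_ms):
    """Total latency: sensor read + render + one display refresh interval."""
    scanout_ms = 1000.0 / refresh_hz
    return sensor_ms + render_ms + scanout_ms

# At 60 Hz, one refresh alone is ~16.7 ms, leaving only ~3 ms of the
# 20 ms budget for sensing and rendering combined.
budget_ms = 20.0
total = motion_to_photon_ms(60, sensor_ms=1.0, render_ms=2.0)
print(round(total, 1), total <= budget_ms)
```

This is why dedicated tracking hardware and fast-switching displays matter: the software has almost no slack once the panel’s refresh interval is paid for.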

Gear VR is a collaborative effort by Samsung and Oculus, and a cross between the Oculus Rift and Google Cardboard. The display is the Samsung Galaxy Note 4, a compatible Android mobile device equipped with a ‘Quad HD’ 2560×1440 display. This is the highest-resolution screen yet seen on anything approaching a consumer VR headset. The OLED display also enables two key elements: near-instant pixel switching time and the ability to produce truly black pixels. The mobile device is connected to the Gear VR via Micro-USB. While the display is powered by the mobile device, the Gear VR headset itself has a built-in Oculus head-tracking module, enabling more accurate and lower-latency tracking than Cardboard or other headsets that rely solely on standard mobile sensors.

vr comparison chart

Cystic Fibrosis Cough Monitor

Cystic fibrosis is a recessive genetic disorder that affects the lungs, as well as the pancreas, liver, and intestine. The CF gene produces a mutant protein that interferes with cells’ ability to manage chloride. This causes secretions throughout the body to thicken and turn dry and gluey. In the ducts of the pancreas, the flow of digestive enzymes becomes blocked, making a child less able to absorb food, which leads to poor growth. However, the effects on the lungs are what make CF lethal: thickened mucus slowly fills the small airways of the lungs and hardens, shrinking lung capacity. Typically, a child with CF has to be monitored daily for energy levels, appetite, and cough frequency. This data is useful in determining treatments.

The Little Devices Lab at MIT focuses on the design, intervention, and policy spaces for DIY health technologies. By developing a series of medical technology prototyping kits, the Lab has provided new materials and “plug-and-play” components to help nurses and doctors expand their ideas and solutions for better patient care.

The Lab was approached by a parent/nurse to develop a wearable sensor to measure and record her child’s cough frequency. Ultimately, the wearable sensor would be able to transfer the data to a smart phone (or some other means for storing data) enabling physicians and parents to keep track of a child’s health.

Through the opportunities of the MSSD program, I was able to prototype the wearable cystic fibrosis cough monitor for the Little Devices Lab using open-source electronics and software.

My initial tests were to determine cough occurrences based on movement and sound. I began the project using an Arduino, as well as two breakout boards for accelerometer and microphone data. However, these components are bulky, and when it comes to wearable electronics, size and power matter. Since the CF monitor is intended to be a wearable device, I researched other microcontrollers. The relatively small Flora can be sewn into fabric and washed, and it makes use of rechargeable micro-LiPo batteries. With its onboard 3.3 V 150 mA regulator, the Flora can power most common modules and sensors. I then migrated my tests from the Arduino version to the Flora.

CF Monitor

For this prototype, I used a velcro strip as an armband to hold the components.


Metaio Tutorial

Creating a New AR Project with Metaio and Unity

1. Create a new Unity project.
Right-click in the Project panel and select Import Package→Custom Package to import the metaioSDK Unity package.

2. Add a user layer
Click on the Layers drop-down at the top-right corner of the Editor and select Edit Layers. Add a new layer in the next available User Layer slot, e.g., call it “metaioLayer”.

3. Set Application Signature
After creating a signature, you can start building your own application. Delete the Main Camera object created by default and instantiate a metaioSDK prefab by dragging and dropping it into the Hierarchy view of your project. Click on metaioSDK to edit the signature in the Inspector window.

4. Add Tracking Config
Create a folder within the StreamingAssets folder and include the tracking configuration with its respective patterns.
Select metaioSDK in the Hierarchy panel and then, in the Inspector view, select StreamingAssets in the Select Configuration drop-down menu. You can then simply drag and drop your tracking configuration file into the Inspector view.

5. Binding Unity game objects with tracking data
Instantiate a metaioTracker prefab by dragging and dropping it into the Hierarchy view of your project. A metaioTracker instance is a reference in your Unity project to a corresponding coordinate system (COS) in your tracking configuration file. Note that you have to set a Coordinate System ID that corresponds to a valid COS in the currently set tracking data. Once that is done, you can attach new game objects to the metaioTracker instance.

6. Build and Run for Platform
Check Player settings and set application signature before building and running.

Manually add the following libraries and frameworks in Xcode:
…/Project Folder/Libraries/metaioSDK

If using Metaio SDK below 6.0, you must modify the sensors.h, sensors.m, and .plist for GPS support.


If an Android device is plugged into the computer via USB, Build and Run in Unity will place the Android package file on your device for you. Otherwise, export a development build of the Unity app for Android Studio.

Metaio Example

For my thesis, I have begun developing a cross-platform augmented reality app with the Metaio SDK and Unity3D.

Metaio SDK

Metaio stands out from other SDKs because it offers a variety of AR solutions. The free, watermarked version of the Metaio SDK is currently supported on Android, iOS, and Windows, with an additional plugin for development in Unity3D for the Android, iOS, Windows, and OS X platforms. There is also support for Google Glass and the Epson Moverio BT-200 glasses. To remove the watermark from applications, you must purchase a Metaio license.

An alternative to the SDK is Metaio Creator, augmented reality software that allows users to create a complete AR scenario without specialized programming knowledge through a drag-and-drop interface. Additionally, Metaio is the creator of Junaio, a free mobile AR browser available for Android and iOS devices. Junaio allows users to experience mobile augmented reality through multiple channels on their mobile devices. Another solution for creating interactive augmented reality is AREL (Augmented Reality Experience Language), a scripting approach based on HTML5, XML, and JavaScript. AREL can also be used for creating Junaio channel content.

Unity is a development platform for creating 2D/3D games and interactive experiences across multiple devices, such as iOS, Android, desktop, or game consoles. Below are some steps for setting up the Metaio example project in Unity.

Getting Started

Sign up for a Metaio account to download the SDK. Download the latest version of Unity. You can also sign up as a Unity developer here.

Application Identifier and Signature

Each application needs a unique signature. The Application Signature is a string that matches the Application Identifier. If the Application Identifier is changed, the corresponding Application Signature has to be changed as well.

If registered as a Metaio developer, log in to create an app signature under the My Apps tab.
Metaio SDK signature

Example App in Unity

1. Start Unity and go to File->Open Project…, select the following directory: \_Unity\Example

2. With the Unity example project open, go to File->Build Settings and select the desired platform. Unity will automatically re-import the required assets according to the platform.


3. Set Application Identifier and Signature
Select Player settings to edit the Bundle Identifier.
Select MetaioSDK from the hierarchy and edit the SDK signature appropriately.


4. Click Build and Run
For iOS, open the Xcode project exported from Unity. The SDK and some of its dependencies have to be added manually to the Xcode build settings. Go to Build Phases and add the following frameworks and libraries:


For Android, there are two options: export a development build of the Unity app for Android Studio, or plug in your Android device via USB and use Build and Run to install the Android package file directly on the device.

Note that you don’t necessarily have to test on a mobile platform; you can also just hit the Play button to run the examples directly in the Unity Editor. Choose the platform in the Build Settings menu.

Augmented Reality

Augmented Reality (AR) allows for interaction with the physical world. Any element in the real world can be augmented with sound, video, images, or 3D objects based on GPS data and image or marker recognition. These augmented elements are typically overlaid on a live camera feed via a webcam, a mobile device, or, increasingly, wearables such as smart glasses. Many Software Development Kits (SDKs) are available that provide tools and libraries to develop AR applications on multiple platforms, such as iOS, Android, Windows Mobile, BlackBerry, or desktop (OS X, Windows, Linux). That said, each SDK has its own features and limitations for its intended platforms.

Below is a list of features provided by the leading AR SDKs.

Interactive Haunted Hallway

This Halloween I was tasked with designing a few themed interactives.


I came up with a couple of ideas, one being a choose-your-own-adventure-style interactive where participants can select a level of fear they are comfortable with. On a scale from 1 to 5 (1 being the lowest), participants are unknowingly selecting a creepy Halloween-themed GIF to be projected on a scrim in the hallway they will walk through.
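The fear-dial logic can be sketched as a small function (my reconstruction for illustration; the GIF filenames and the 0-1023 potentiometer range are assumptions, not the actual project assets):

```python
# Sketch of the fear-dial: an analog potentiometer reading (0-1023 on an
# Arduino) is bucketed into a fear level 1..5, which picks the gif to project.
# Filenames below are made up for illustration.
GIFS = {1: "mild.gif", 2: "spooky.gif", 3: "creepy.gif",
        4: "scary.gif", 5: "terrifying.gif"}

def fear_level(pot_value, max_value=1023):
    """Bucket an analog reading (0..max_value) into fear levels 1..5."""
    level = pot_value * 5 // (max_value + 1) + 1
    return max(1, min(level, 5))

# A mid-dial reading lands in the middle of the scale.
print(fear_level(512), GIFS[fear_level(512)])
```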


Using foam board, I constructed a basic control box to house all the electronic components.


The Arduino is connected to a breadboard with LEDs, a potentiometer, and a servo motor.


The Arduino is also connected to a MacBook, with a long HDMI cable running to a pico pocket projector mounted in the hall.




I planned to have spiders drop down from the ceiling when movement is detected. I started prototyping with servo motors, but in order to wind the spider up in both directions, I moved on to a couple of DC toy hobby motors from Adafruit. However, working with DC motors was quite a challenge. There was a lot of troubleshooting involved, and the hobby motors I got turned out not to be ideal. I had purchased a few different-sized Halloween spiders, but the motors’ torque and size could not handle the weight of any of them. Due to this issue, I constructed some spiders out of pipe cleaners. Eventually, everything worked.


Again, I constructed another foam board box to be mounted to the ceiling/wall and house all the electronic components.


A PING ultrasonic sensor is mounted to the side of the wall and is triggered when participants walk by.


Watch this video to see the interactives in action.

For further documentation of the Haunted Hallway project, feel free to download this PDF.

Stretch Sensor

Conductive Rubber Cord (Stretch Sensor)

Similar to a thermistor circuit, the program measures the analog voltage and converts it back to resistance. When the conductive rubber cord is stretched, the change in resistance is detected and the program lights an LED.


// Sensor pin - GND
// Sensor pin - Analog In 0, with 10K resistor to +5V
int LedPin = 13;       // LED connected to pin 13
int SensorPin = A0;    // Sensor connected to analog pin A0

void setup() {
    // initialize serial communications
    Serial.begin(9600);
}

void loop() {
    // read the voltage from the voltage divider (sensor plus resistor)
    int sensor = analogRead(SensorPin);

    // map(value, fromLow, fromHigh, toLow, toHigh) rescales the sensor's
    // analog range to the output range:
    //   value:    the number to map
    //   fromLow:  the lower bound of the value's current range
    //   fromHigh: the upper bound of the value's current range
    //   toLow:    the lower bound of the value's target range
    //   toHigh:   the upper bound of the value's target range
    int output = map(sensor, 30, 70, 0, 255);

    // print out the result
    Serial.print("analog input: ");
    Serial.print(sensor);
    Serial.print(" output: ");
    Serial.println(output);
    analogWrite(LedPin, output);

    // pause before taking the next reading
    delay(100);
}
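The map() call in the sketch is plain integer rescaling. A Python equivalent (my own sketch, mirroring the Arduino core’s integer arithmetic for positive ranges) makes the math explicit:

```python
def map_range(value, from_low, from_high, to_low, to_high):
    """Python version of Arduino's map(): rescale a value from one range to
    another using integer arithmetic, as the Arduino core does (for positive
    ranges; Python's floor division differs from C truncation for negatives)."""
    return (value - from_low) * (to_high - to_low) // (from_high - from_low) + to_low

# A sensor reading of 50, halfway through the 30-70 input range,
# lands near the middle of the 0-255 output range.
print(map_range(50, 30, 70, 0, 255))  # → 127
```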

Conductive Rubber Cord from Adafruit




Colorado National Monument

Over the last few months, I’ve been researching and developing an augmented reality iOS app for Colorado National Monument in Grand Junction, CO. The key feature of the app allows visitors to stop at augmented waysides in the park.

Moisture Sensor

If It Rains, It Tweets!

A sensor + Arduino + Xbee transmitting data to a Raspberry Pi + Xbee, which tweets if moisture has been detected.

How does it work?

The moisture sensor is an Arduino connected to an Xbee adapter and a printed circuit board with a 555 timer. Leads connected to a sponge send resistance data to the circuit board. The Arduino reads the data, adds the time, and transmits the data to an Xbee connected to a Raspberry Pi. The Raspberry Pi sends out a tweet with the moisture status, e.g., “Extreme Moisture”.


Xbee (transmit/receive), breadboard, leads, sponge, Arduino Uno, Raspberry Pi, USB wifi dongle, printed circuit board with a 555 timer to take moisture readings

Moisture sensor using a 555 timer IC (Express PCB)

One way of quantifying the amount of moisture in a porous medium is to measure its electrical resistance: typically, the higher the water content, the lower the resistance. Under most circumstances the resistance is in the megaohm (million ohm) range. As a reference, 100 feet of the copper wire used in house wiring has a resistance of a few ohms at most; a moist medium has roughly a million times more resistance. We wanted to make a sensor that gives a number compatible with the kinds of data an Arduino can comfortably handle: an integer between 1 and 20,000 or so.

One way of doing this is to make a resistor-capacitor (RC) circuit in which we charge up the capacitor and let it discharge through the moist resistive medium. A common 555 timer integrated circuit (IC) uses an RC circuit for its time base, and we exploited this feature to make a moisture sensor that can be triggered and read by an Arduino. The 555 can be configured in a number of ways; for this application, we chose to use it as a monostable multivibrator, a one-shot. Send the circuit a trigger pulse and the 555 produces an output pulse whose duration is determined by the values of the resistor and capacitor used for its time base. In this case, the resistor is the moist medium, and we can choose a capacitor that will produce a pulse hundreds to a few thousand milliseconds long (0.1 sec to a few seconds).

We tested it with damp paper towels and sponges and determined that a 1 µF capacitor works well, giving an output pulse of about 500-1000 milliseconds. We decided to measure moisture in milliseconds: the shorter the time, the higher the water content of the medium. Since we were going to deploy a number of sensors, we designed a small printed circuit board for the sensor and sent the design off to have the bare board made.
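The one-shot’s timing follows the standard 555 monostable formula, t = 1.1·R·C. A quick sketch (textbook formula; the resistances shown are back-calculated from our observed pulse widths, not measured values) relates pulse time to the medium’s resistance:

```python
def pulse_ms(resistance_ohms, capacitance_farads):
    """Output pulse width of a 555 monostable one-shot, in milliseconds.
    Standard formula: t = 1.1 * R * C (seconds), converted to ms."""
    return 1.1 * resistance_ohms * capacitance_farads * 1000.0

# With the 1 uF timing capacitor, the observed 500-1000 ms pulses imply
# the damp medium's resistance is roughly 455k-909k ohms.
C = 1e-6
for r in (455_000, 909_000):
    print(r, "ohms ->", pulse_ms(r, C), "ms")
```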

555 moisture sensor schematic


555 moisture sensor

Arduino + Xbee transmits to Raspberry Pi + Xbee

Each setup has its own script for receiving or transmitting data.



moisture sensor

Initial Raspberry Pi Set Up

To streamline the development process, we set up remote access via SSH. To do this, we connected a USB wifi dongle to the Pi. The wifi connection is also important because we need internet access for the Twitter implementation.

Serial Communication with the Raspberry Pi

During development, we researched Raspberry Pi GPIO and USB serial communications. One method for accessing the GPIO pins is through the UART, or universal asynchronous receiver/transmitter: the piece of hardware that translates data between parallel and serial forms. By default, the Raspberry Pi is configured to use the UART for console input and output, which means the serial port could not be used in our program. In order to access the dedicated UART pins on the Raspberry Pi, the serial ports needed to be reconfigured; after a few modifications, we could then receive communications. We later decided to use an FTDI USB cable connected directly to an Xbee receiver.

We chose to use pySerial, a module that provides access to the serial port from a Python script. Like other Python modules, pySerial needs to be installed and then imported in your program.


The serial data created by the moisture sensing circuit board is transmitted from the Arduino to Raspberry Pi between Xbees via radio frequency.

raspberry pi + xbee

The Pi uses pySerial to accept that data, reading from the specified USB port. Each Xbee that transmits data has been given a specific numerical name, which is converted into a readable ID of “Xbee 1” or “Xbee 2” when its presence is detected.
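A minimal sketch of the receiving side, assuming a simple comma-separated “name,reading” line format (the actual wire format and Xbee names are not shown here, so both are illustrative):

```python
# Map each transmitter's numerical name to a readable ID (names illustrative).
XBEE_NAMES = {"5001": "Xbee 1", "5002": "Xbee 2"}

def parse_reading(line):
    """Parse a 'name,milliseconds' line into (readable_id, reading)."""
    name, value = line.strip().split(",")
    return XBEE_NAMES.get(name, "unknown"), int(value)

def listen(port_name="/dev/ttyUSB0", baud=9600):
    """Read and print readings forever; run on the Pi with an Xbee attached."""
    import serial  # pySerial, installed separately
    port = serial.Serial(port_name, baud)
    while True:
        xbee_id, reading = parse_reading(port.readline().decode())
        print(xbee_id, reading)
```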


Based on the values we saw in our tests, we specified any value of 3500 or above to mean that no moisture is detected. A moisture status message is sent to the console and Twitter every time data is received. A value in the range 700-1499 has a moisture status of “Extreme Moisture”; likewise, 1500-2249 is “Medium Moisture”, and 2250-3500 is “Minimum Moisture”. Each time moisture is detected, the program resets itself to false and waits for new data to be received.
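The threshold logic can be sketched as a small function (a reconstruction from the ranges described, not the deployed script):

```python
def moisture_status(reading):
    """Classify a sensor reading (555 pulse time in ms) into a moisture
    status, per the empirically chosen ranges. Shorter pulse = wetter."""
    if reading >= 3500:
        return "No Moisture Detected"
    if reading >= 2250:
        return "Minimum Moisture"
    if reading >= 1500:
        return "Medium Moisture"
    # 700-1499 in our tests; anything lower is wetter still
    return "Extreme Moisture"
```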


Twitter Implementation
To access the Twitter API via Python, we chose to use Tweepy, the well-known Twitter library for Python. In order to set up authentication keys, we signed up for a Twitter developer account.



Pi Beacon

Rasp Pi

What are Beacons/iBeacon?

Beacons are hardware devices that use Bluetooth Low Energy to advertise a Universally Unique Identifier (UUID) to devices like a smartphone. iBeacon is Apple’s protocol for accessing the UUID data advertised by a beacon. This technology allows for proximity sensing and indoor positioning. For example, a device can alert a user when they are in range of a beacon and surface information based on that location.

Beacon hardware, such as Estimote or Gimbal, is fairly inexpensive. However, it is also quite simple to build your own with a Raspberry Pi.

Resource: How to make a Pi Beacon from Radius Networks

Raspberry Pi Beacon transmitter

Using a Bluetooth LE Dongle connected to a Raspberry Pi and a script to transmit a UUID, the Raspberry Pi can act as a Beacon.

Bluetooth Dongle

Resource: Installing Bluetooth on a Raspberry Pi from

To advertise the UUID on startup, I ran the script from /etc/rc.local

edit /etc/rc.local command

~$ sudo nano /etc/rc.local

While inside the editor, modify /etc/rc.local so that the Raspberry Pi navigates to the directory of the beacon script and runs it:

cd /home/pi/PiBeacon
sudo bash startBeacon

startBeacon script

stopBeacon script


iOS Beacon Receiver app
After reading the iBeacon documentation and researching the Estimote SDK, I developed a sample application that displays a different clickable icon based on the level of proximity to my Raspberry Pi Beacon. Depending on which icon is clicked, different content is displayed.



Levels of proximity:
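iBeacon reports three proximity zones: immediate, near, and far. As a rough sketch of how distance can be estimated from signal strength (a textbook log-distance path-loss model with illustrative thresholds, not Apple’s or Estimote’s actual ranging algorithm):

```python
def estimated_distance_m(rssi, measured_power=-59, n=2.0):
    """Rough distance estimate from a log-distance path-loss model.
    measured_power: expected RSSI at 1 m; n: environmental path-loss factor."""
    return 10 ** ((measured_power - rssi) / (10.0 * n))

def proximity_zone(rssi):
    """Bucket a reading into iBeacon-style zones (thresholds illustrative)."""
    d = estimated_distance_m(rssi)
    if d < 0.5:
        return "immediate"
    if d < 4.0:
        return "near"
    return "far"
```

In the sample app, each zone maps to a different clickable icon, so the content shown depends on how close the visitor is to the Pi beacon.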

Pi Tweet

Install wifi dongle for internet access on the Pi

Tweepy is a Python library for accessing the Twitter API. In order to use the library, I first had to set up authentication keys by signing up for a Twitter developer account.

raspberry pi tweet

sample python tweepy script:

#!/usr/bin/env python

import sys
import tweepy

# API keys from your Twitter developer account
CONSUMER_KEY = "..."
CONSUMER_SECRET = "..."
ACCESS_KEY = "..."
ACCESS_SECRET = "..."

auth = tweepy.OAuthHandler(CONSUMER_KEY, CONSUMER_SECRET)
auth.set_access_token(ACCESS_KEY, ACCESS_SECRET)
api = tweepy.API(auth)

# tweet the message passed as a command line argument
api.update_status(sys.argv[1])

terminal command:

python "Hello, from my Pi"

Pi Photobooth

I connected a Pi Camera and a thermal printer to a Raspberry Pi. When you press a button, the camera takes a picture and immediately prints the image.


Raspberry Pi
Mini Thermal Receipt Printer
jumper wires
10k resistor

Thermal Printer Tutorials

Little Box of Geek Project from Geek Gurl Diaries
Part 1
Part 2

Picam Tutorial

Python PiCam Setup from Raspberry Pi






Interactive Marionettes

Gustave Baumann was born in Germany and emigrated to the United States at the age of ten. While he is known as an American artist, primarily as a printmaker, his marionette skills were deeply rooted in his German heritage.

The New Mexico Museum of Art in Santa Fe owns a collection of over seventy marionettes carved by artist Gustave Baumann during the 1920s and 30s. Their age and fragility means that they are rarely displayed and never used as designed. Figuring out a way to surface this collection and to encourage an interactive experience has been a goal for the collection’s curators for a long time.

The museum approached the development team from Media Arts & Technology at New Mexico Highlands University. In order to allow visitors to interact with a rarely seen collection, five of these unique marionettes have been scanned via photogrammetry. Using current technology, the scans have been repurposed to demonstrate the articulation of the marionettes.

The project uses Microsoft Kinect hardware and software written for Unity3D to allow the user to embody a marionette. I recently prototyped another version, which uses the Leap Motion Controller hardware and JavaScript to provide the user with the fine-tuned experience of controlling a marionette’s movements.

The Leap Motion Controller has a significantly finer range than the Kinect and is an ideal solution to complete the user experience. Using the JavaScript framework for Leap, marionette data is attached to ragdoll physics bodies that respond to fine movements of the user’s fingers and hands. The idea was to mimic the control planes used by traditional puppeteers, but I discovered the Leap hardware has some sensitivity issues sensing the roll of a user’s hand. Pitch and yaw are easily readable, but without roll the experience would be incomplete, so I modified the metaphor to give the feeling of controlling the strings directly with the user’s fingers.
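For reference, hand orientation from the Leap is derived from its direction and palm-normal vectors. This sketch computes pitch, yaw, and roll along the lines of Leap’s own helpers (treat the exact sign conventions as my assumption); roll comes from the palm normal, which is the reading we found hardest to sense reliably:

```python
import math

def pitch(direction):
    """Rotation around the x-axis, from the hand's direction vector (x, y, z).
    In Leap's coordinates, -z points away from the user into the scene."""
    x, y, z = direction
    return math.atan2(y, -z)

def yaw(direction):
    """Rotation around the y-axis, from the direction vector."""
    x, y, z = direction
    return math.atan2(x, -z)

def roll(palm_normal):
    """Rotation around the z-axis, from the palm-normal vector -- the
    reading the hardware had the most trouble with."""
    x, y, z = palm_normal
    return math.atan2(x, -y)

# A hand pointing straight into the screen (-z), palm facing down (-y),
# reads as zero pitch, yaw, and roll.
print(pitch((0, 0, -1)), yaw((0, 0, -1)), roll((0, -1, 0)))
```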

The Baumann Marionettes Interactive will be included in the exhibit “Gustave Baumann: A Santa Fe Legend”, which opened at the Las Cruces Museum of Art in February 2014.

demo site:

Leap Experiment

I’m interested in gesture-based interfaces and have been learning about the Leap Motion Controller, a device that senses hands, fingers, and multiple gestures, allowing you to interact with content in a new way.

Leap Motion controller

Leap Experiment

Below is some source code I wrote for an example that tracks a color-changing circle to your fingers. If you have a Leap, check out the demo.



document.addEventListener("DOMContentLoaded", init);

var canvas;
var ctx;

function init() {
	canvas = document.getElementById("leap-canvas");

	// fullscreen
	canvas.width = document.body.clientWidth;
	canvas.height = document.body.clientHeight;

	// create a rendering context
	ctx = canvas.getContext("2d");

	// change color of pointable at interval
	setInterval(color, 50);

	// listen to Leap Motion frames
	Leap.loop(draw);
}

function color() {
	// random rgb color
	this.r = Math.floor(Math.random() * 255);
	this.g = Math.floor(Math.random() * 255);
	this.b = Math.floor(Math.random() * 255);
	ctx.fillStyle = "rgb(" + this.r + ", " + this.g + ", " + this.b + ")";
}

// render each frame
function draw(obj) {
	// clear last frame
	ctx.clearRect(0, 0, canvas.width, canvas.height);

	// render circles based on pointable positions
	var pointablesMap = obj.pointablesMap;
	for (var i in pointablesMap) {
		// get the pointable's position
		var pointable = pointablesMap[i];
		var pos = pointable.tipPosition;

		// create a circle for each pointable; radius falls off with depth
		var radius = Math.min(600 / Math.abs(pos[2]), 20);
		ctx.beginPath();
		ctx.arc(canvas.width / 2 + pos[0], canvas.height - pos[1], radius, 0, 2 * Math.PI);
		ctx.fill();
	}
}



Download from Github

Wii Paint


Wii Paint is a desktop application I wrote in Objective-C. It works with a Wii Remote, a battery-operated IR sensor bar, and DarwiinRemote, an application that maps Wii Remote buttons to keyboard keys.

wii sensor

Wii Remote

Keyboard Keys for WiiPaint Desktop application:
n is for new canvas (clears the screen)
b goes back in the array of colors
c goes forward in the array of colors

Using IR mode maps Wii Remote movements, read via the IR sensor, to mouse movements.

Darwiin Preferences:
Darwiin controls

Emergence Interactive Timeline

Life’s origins are represented as a series of events depicted through Earth’s geosphere and biosphere. The exhibit timeline also includes specimens and illustrations of early Earth, as well as the formation of Earth.

Design and develop an interactive version of the exhibit timeline.
Emergence Interactive Timeline
text (bio/geo content)
images (earth illustrations)
audio/video (scientist interviews)
easily updatable (by museum staff)
designed for iPad (for use in exhibit)
accessible on the web (for use in classroom)

Use of the following:
HTML5/CSS3 + Canvas, XML
fancybox + jQuery

The overall challenge of this project has been that it is still early in the HTML5 adoption cycle, and there is limited documentation on Canvas implementation.

– Resizing based on browser window
– Click Coordinates
– Drawing Video on iPad

– Cross-browser/platform Testing
– Use Open Source Frameworks/Libraries/Tools (fancybox)

Demo Site (best viewed in Chrome Browser):

Media Queries

Below is a short example of my use of CSS media queries for the Alcoves 12.0 responsive website.



Navigation HTML

Navigation CSS

nav {
    width: auto;
    height: auto;
    margin-top: -80px;
    margin-right: 40px;
    padding: 0;
    float: right;
    /* background */
    background: #404c5f;
}

nav ul {list-style: none;}

nav ul li {
    float: left;
    position: relative;
}

nav ul li a {
    font-weight: bold;
    color: #fff;
    display: block;
    padding: 10px 20px;
    text-decoration: none;
}

nav ul li a:hover {color: #7791ae;}
nav ul li:hover, nav ul li.active {background: none;}
nav ul li ul {display: none; background: #4c617a;}
nav ul li:hover ul {display: block; position: absolute;}
nav ul li ul li {color: #fff; width: 180px; display: inline-block; float: left;}
nav ul li:hover ul li {background: none;}
nav ul li:hover ul a {padding: 5px 20px;}
nav ul li:hover ul a:hover {background: #647d99;}

@media screen and (max-width: 720px) {
    nav {
        width: 100%;
        height: auto;
        margin: 0;
        padding: 0;
        position: relative;
        /* box shadow */
        -webkit-box-shadow: inset 0 1px 0 rgba(255,255,255,.1), 0 0 1px rgba(0,0,0,.4);
        -moz-box-shadow: inset 0 1px 0 rgba(255,255,255,.1), 0 0 1px rgba(0,0,0,.4);
        box-shadow: inset 0 1px 0 rgba(255,255,255,.1), 0 0 1px rgba(0,0,0,.4);
    }

    nav ul li a {
        font-weight: bold;
        color: #fff;
        display: block;
        padding: 10px 40px;
        text-decoration: none;
    }

    nav ul li a:hover {color: #fff;}
    nav ul li:hover, nav ul li.active {background: #58595b;}
}
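When the same 720px breakpoint also needs to drive JavaScript behavior (for example, toggling a collapsed menu), the standard API is `window.matchMedia('(max-width: 720px)')`; the decision itself is easy to isolate as a pure function. A small sketch, with class names that are illustrative rather than from the Alcove site:

```javascript
// Pick a nav layout from the viewport width, mirroring the
// max-width: 720px media query used in the CSS above.
function navLayout(viewportWidth, breakpoint = 720) {
  return viewportWidth <= breakpoint ? 'stacked' : 'floated';
}

console.log(navLayout(1024)); // → floated
console.log(navLayout(600));  // → stacked
```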

AmeriCorps 2012-2013

I’m currently in my third term of the program interning at The New Mexico Museum of Art, as well as the New Mexico Museum of Natural History & Science.

At The New Mexico Museum of Art, I was tasked with creating a gallery kiosk to showcase artists from the Alcove Shows. Alcove 12.0 included the work of 45 artists from across the state of New Mexico, with five new artists exhibited every five weeks. The kiosk allows visitors to preview work from previous shows. Throughout the remainder of the internship, I programmed various gallery interactives to include additional videos and/or photos for exhibits. I also designed templates for events, as well as a redesign and style guide for email blasts.

While at the New Mexico Museum of Natural History & Science, I was able to revisit an exhibit I had worked on as a student in 2011. As part of its National Science Foundation funding, the Emergence: A New View of Life’s Origins exhibit was to provide educational resources and teacher materials. I redesigned the Emergence website, adding a resources section, and designed promotional materials. The promo materials, along with resources loaded onto custom USB drives, were mailed to science education teachers in New Mexico.

Overall, I think being able to gain experience in various institutions has prepared me for future employment.  My experience with AmeriCorps has been invaluable.

MW 2013

Last week, I attended Museums and the Web in one of my favorite cities, Portland. I served as a conference volunteer, as well as a representative for NMHU and AmeriCorps. It was a really great experience being able to network with others in the museum and technology field.

Here is a short video I compiled after my trip.

C++ Black Jack

Below is a text-based Blackjack game I developed in C++.

blackjack.cpp file

// Name        : blackjack.cpp
// Author      : Rianne Trujillo
// Version     :
// Copyright   :
// Description : Black Jack in C++

#include <iostream>
#include <string>

#include "dealer.h"
#include "player.h"

using namespace std;

int main() {

	//display game title
	cout << "BLACKJACK\n" << endl;

	//player class + human instance
	Player human;
	//ask for the player's name
	cout << "Enter Your Name: ";
	cin >> human.playerName;

	//print greeting
	cout << "Welcome " << human.playerName << ". Let's Play!\n" << endl;

	//play hands until the player quits
	human.newGame();

	return 0;
}


player.cpp file

// player.cpp

#include <cstdlib>
#include <ctime>
#include <iostream>

#include "player.h"
#include "dealer.h"

Player::Player() {
}

void Player::play21(void) {

	//randomize the cards
	srand((int) time(0));

	Dealer dealer;

	// deal the cards
	int person = dealer.dealCards(2, "Your Cards:");
	std::cout << "Total = " << person;
	std::cout << "\n";
	int computer = dealer.dealCards(2, "Computer's Cards:");
	std::cout << "Total = " << computer;
	std::cout << "\n";

	// Ask if human wants a hit
	char takeHit = 'y';
	while (takeHit != 'n') {
		if (person < 21) {
			//does the player want a hit?
			std::cout << "\nDo you want a hit (y/n)? ";
			std::cin >> takeHit;
			//if yes, deal a card
			if (takeHit == 'y') {
				person += dealer.dealCards(1, "\nHit:");
				std::cout << "\n";
				std::cout << "Total: " << person << "\n";
			} else {
				//the player does not want another card
				takeHit = 'n';
			}
		} else {
			//the player has busted or reached 21
			if (person > 21)
				std::cout << "\nYOU BUSTED!\n";
			takeHit = 'n';
		}
	}

	//determine if the computer takes a hit
	while ((computer < person) && (computer <= 21) && (person <= 21)) {
		std::cout << "\n";
		computer += dealer.dealCards(1, "The Computer took a card: ");
	}

	//show who won
	dealer.determineWinner(person, computer);
}

void Player::newGame(void) {
	char keepPlaying = 'y';

	do {//play a hand, then ask whether to continue
		play21();
		std::cout << "\nDo you want to play another hand (y/n)? ";
		std::cin >> keepPlaying;
		std::cout << "\n";
	} while (keepPlaying == 'y');

	//if no, print game over
	std::cout << "\nGAME OVER.\nThanks For playing!\n";
}


player.h file

// player.h

#ifndef PLAYER_H_
#define PLAYER_H_

#include <string>

class Player {
public:
	Player();
	void play21(void);
	void newGame(void);

	//the player's name
	std::string playerName;
};

#endif /* PLAYER_H_ */

dealer.cpp file

// dealer.cpp

//dealer shuffles cards and
//gives a card to players when they ask for a hit

#include <cstdlib>
#include <iostream>

#include "dealer.h"
#include "player.h"

Dealer::Dealer() {
	personScore = 0;
	computerScore = 0;
}

int Dealer::dealCards(int numCards, std::string message) {
	//set cardDealt and totalValue to 0
	int cardDealt = 0;
	int totalValue = 0;
	//print the label for this hand
	std::cout << message << " ";
	//deal the number of required cards
	for (int i = 1; i <= numCards; i++) {
		//deal a card between 1 and 10
		cardDealt = Shuffle(1, 10);
		//an ace (1) counts as 11 unless that would bust the hand
		if (cardDealt == 1) {
			if (totalValue + 11 <= 21)
				cardDealt = 11;
			else
				cardDealt = 1;
		}
		//accumulate the card values
		totalValue += cardDealt;
		std::cout << cardDealt << " ";
	}
	//return total value
	return totalValue;
}

void Dealer::determineWinner(int person, int computer) {

	//display total scores
	std::cout << "\nYour Score: " << person;
	std::cout << "\nComputer Score: " << computer;
	std::cout << "\n";

	//display winner:
	//if person equals computer, it's a tie;
	//if person is 21, or >= computer, or computer busted, person wins;
	//else computer wins
	if (person == computer)
		std::cout << "\nTie";
	else if ((person == 21 || person >= computer || computer > 21) && (person <= 21))
		std::cout << "\nYou Won!\n";
	else
		std::cout << "\nThe Computer Won!\n";
}

int Dealer::Shuffle(int lowerLimit, int upperLimit) {
	//returns a random number within the given boundary
	return lowerLimit + rand() % (upperLimit - lowerLimit + 1);
}

dealer.h file

// dealer.h

#ifndef DEALER_H_
#define DEALER_H_

#include <string>

class Dealer {
public:
	Dealer();
	int dealCards(int numCards, std::string message);
	void determineWinner(int person, int computer);
	int Shuffle(int lowerLimit, int upperLimit);

private:
	int personScore;
	int computerScore;
};

#endif /* DEALER_H_ */

AmeriCorps 2011-2012

When my previous AmeriCorps internship ended in August 2011, I decided to re-enroll for a full-time position. In October, I continued interning at the same institution while splitting time at the New Mexico Department of Cultural Affairs. Over the course of the year, I continued utilizing and developing my skills.

At The Albuquerque Museum of Art and History, I designed and developed an iOS app providing artwork and artist information for the ISEA2012 Exhibition installed at the museum. I was also given the opportunity to design posters, brochures, and educational booklets. During Albuquerque’s centennial, I created a scavenger hunt using SCVNGR, designing signage explaining how to play as well as recognizable stickers placed near artworks related to the hunt.

At the New Mexico Department of Cultural Affairs, I designed and developed the backend for the Office of Archaeological Studies Pottery Typology Project website. The admin-side interface allows users to enter pottery types into a database. I also provided content management and developed registration forms for the New Mexico Association of Museums Annual Conference.

Pottery Typology

AmeriCorps Internship Program
New Mexico Department of Cultural Affairs
Office of Archaeological Studies Pottery Typology Project

Developed an admin-side interface that allows users to enter pottery types into a database.

Visit Website

Alcove Shows 12.0

AmeriCorps Cultural Technology Internship
The New Mexico Museum of Art
Alcove Shows 12.0

Designed and developed an in-gallery kiosk and website archiving a series of nine exhibitions and the work of 45 artists from across the state of New Mexico.

Visit Website


App Interface

AmeriCorps Cultural Technology Internship
The Albuquerque Museum of Art and History
ISEA 2012 iOS Application

Designed and developed iOS App that provides artwork and artist information for the ISEA2012 Exhibition installed at the Albuquerque Museum of Art & History.

App no longer available on the app store.

AmeriCorps 2011

Sculpture Garden App

After graduating from New Mexico Highlands University in May 2011, I enrolled in the AmeriCorps Cultural Technology program and soon began a minimum-time position as an intern at The Albuquerque Museum of Art and History. My scope of work consisted of recording and editing audio, as well as designing and developing a mobile audio tour of the museum’s sculpture garden.

Sculpture Garden Mobile App

App Interface

AmeriCorps Cultural Technology Internship
The Albuquerque Museum of Art and History
Sculpture Garden

Recorded and edited audio for mobile audio tour via OnCell Systems mobile web/cell phone tours.

Designed and developed Sculpture Garden iOS application. No longer available on the app store.

Listen via Oncell


PICT: Program for Cultural Technology
New Mexico Museum of Natural History & Science
Emergence: A New View of Life’s Origin exhibit

Collaborated on the design and development of “Emergence: A New View of Life’s Origin”, a permanent exhibit of the New Mexico Museum of Natural History & Science.

Visit Website


Seabury Fellowship in Cultural Technology
Award Recipient, Spring 2011
If Only

The Seabury Fellowship gives students the financial support and freedom to produce independent portfolio projects. As a recipient, I was given the opportunity to create a rotoscope animation.

Music by Levell C. Lee
Special Thanks to Deborah Holloway

View Press Release

Watch on Youtube

NM Furniture is Art

National Hispanic Cultural Center Internship
New Mexico Furniture is Art

Design and installation of a vitrine showcasing the NHCC’s furniture exhibition.


New Mexico History Museum
The Mother Teal and the Overland Route

Co-illustrated and co-animated a short story for the “Wild At Heart: Ernest Thompson Seton” exhibition, on display from May 2010 to May 8, 2011.

Cabrini Martinez, co-illustrator and co-animator

Watch on Vimeo