Beacons for Accessibility

I have been working on a beacon-based app for visitors with blind/low-vision accessibility needs as a pilot project for Acadia National Park. The overall goal is to allow visitors to follow along a tour path of nine stops. To keep track of each visitor's position, I am using Estimote indoor location beacons.


Touch Table Interactive

I have been working on two 55-inch touch table interactives for the Anderson-Abruzzo Albuquerque International Balloon Museum and their upcoming exhibit on S.A. Andrée's failed attempt to reach the North Pole by balloon in 1897.


The first touch table will be embedded in an antique table and housed in a space designed as a Victorian study. The table will feature vintage diaries containing content about S.A. Andrée's expedition plan and other polar expeditions of the time.

The second table will be embedded in an iceberg shape and housed in a space referred to as the Ice Room. The table will feature content based on the events of S.A. Andrée's flight and the artifacts found at the crew's final camp on White Island.

The framework for both interactives has been developed with Hammer.js, an open-source JavaScript gesture library, and NW.js, a tool that packages and distributes Node.js-based web apps as desktop applications.

demo
demo code

Gear VR

Aside from the Gear VR headset, the Innovator Edition package includes a 16GB MicroSD card and adapter, a screen cleaning cloth, and a soft-shell travel case.


The Mobile SDK from Oculus can be found here: https://developer.oculus.com/downloads/#sdk=mobile

Virtual Reality

Unlike augmented reality, virtual reality (VR) places the user in a fully simulated environment based on fictitious or real-world locations, allowing users to have a virtual presence within it. VR typically requires specialized, costly hardware for its head-mounted display (HMD) and head tracking.

In 2012, Oculus released the Rift Development Kit 1 (DK1), one of the first VR headsets of its kind. The Oculus Rift HMD is powered by a computer via USB and includes sensors such as an accelerometer, magnetometer, and gyroscope. With an SDK and support for Mac, Windows, and Linux, the DK1 placed inexpensive VR in the hands of developers and gamers. Two years later, Oculus released Development Kit 2, while other HMDs aim to compete.

Google later launched Cardboard, a more accessible and inexpensive HMD. Cardboard allows any Android mobile device running the appropriate software (Jelly Bean and above) to simulate a VR experience similar to the Oculus Rift. However, most Android devices still have insufficient performance, and their displays are not equipped to handle low-latency VR.

When it comes to latency, I am referring to the display's response time for head tracking. To make a VR experience as immersive as possible, every millisecond counts: 20ms or less is the ideal latency for reducing the motion blur induced by head movement, enabling more realistic scenes with richer colors and contrast. Anything higher introduces noticeable lag, which can lead to a detached experience and contribute to a user's motion sickness or dizziness.

Gear VR is a collaborative effort between Samsung and Oculus, and a cross between the Oculus Rift and Google Cardboard. The display is the Samsung Galaxy Note 4, a compatible Android mobile device equipped with a 2560×1440 'Quad HD' display, the highest-resolution screen yet seen on anything approaching a consumer VR headset. The OLED display also enables two key elements: near-instant pixel switching and the ability to produce truly black pixels. The mobile device connects to the Gear VR via micro USB. While the display is powered by the mobile device, the Gear VR headset itself has a built-in Oculus head-tracking module, enabling more accurate and lower-latency tracking than Cardboard or other headsets that rely solely on standard mobile sensors.

vr comparison chart

Cystic Fibrosis Cough Monitor

Cystic fibrosis (CF) is a recessive genetic disorder that affects the lungs, as well as the pancreas, liver, and intestine. The mutated CF gene produces a defective protein that interferes with cells' ability to manage chloride. This causes secretions throughout the body to thicken and turn dry and gluey. In the ducts of the pancreas, the flow of digestive enzymes becomes blocked, making a child less able to absorb food, which leads to poor growth. However, the effects on the lungs are what make CF lethal: thickened mucus slowly fills the small airways of the lungs and hardens, shrinking lung capacity. Typically, a child with CF has to be monitored daily for energy levels, appetite, and cough frequency. This data is useful in determining treatments.

The Little Devices Lab at MIT focuses on the design, intervention, and policy spaces for DIY health technologies. By developing a series of medical technology prototyping kits, the Lab has provided new materials and “plug-and-play” components to help nurses and doctors expand their ideas and solutions for better patient care.

The Lab was approached by a parent (and nurse) to develop a wearable sensor to measure and record her child's cough frequency. Ultimately, the wearable sensor would transfer the data to a smartphone (or some other means of storing data), enabling physicians and parents to keep track of a child's health.

Through the opportunities of the MSSD program, I was able to prototype the wearable cystic fibrosis cough monitor for the Little Devices Lab using open-source electronics and software.

My initial tests were to determine cough occurrences based on movement and sound. I began the project using an Arduino, along with two breakout boards for accelerometer and microphone data. However, these components are bulky, and when it comes to wearable electronics, size and power matter. Since the CF monitor is intended to be a wearable device, I researched other microcontrollers. The Adafruit Flora is relatively small, can be sewn into fabric and washed, and makes use of rechargeable micro-LiPo batteries. With its onboard 3.3V 150mA regulator, the Flora can power most common modules and sensors. I then migrated my tests from the Arduino version to the Flora.
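As a rough illustration of those early tests, below is a minimal Python sketch of the kind of threshold rule that flags a cough when accelerometer and microphone samples spike together. The thresholds, names, and sample values are hypothetical, not taken from the actual prototype.

# hypothetical sketch: flag a cough event when movement and sound
# spike at the same time in logged sensor samples
ACCEL_THRESHOLD = 1.5  # acceleration magnitude in g (illustrative)
MIC_THRESHOLD = 600    # raw ADC reading (illustrative)

def is_cough(accel_magnitude, mic_level):
    # a sample counts as a cough when both signals exceed their thresholds
    return accel_magnitude > ACCEL_THRESHOLD and mic_level > MIC_THRESHOLD

# (accel magnitude, mic level) pairs, e.g. parsed from a logged test run
samples = [(0.9, 210), (1.8, 700), (1.1, 300), (2.0, 650)]
print(sum(1 for a, m in samples if is_cough(a, m)), "cough events flagged")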

CF Monitor

For this prototype, I used a velcro strip as an armband to hold the components.

presentation

Metaio Tutorial

Creating a New AR Project with Metaio and Unity

1. Create a new Unity project.
Right-click in the Project panel and select Import Package→Custom Package to import the metaioSDK Unity package.

2. Add a user layer
Click the Layers drop-down at the top-right corner of the Editor and select Edit Layers. Add a new layer in the next available User Layer slot, e.g., call it "metaioLayer".

3. Set Application Signature
After creating a signature, you can start building your own application. Delete the Main Camera object created by default and instantiate a metaioSDK prefab by dragging and dropping it into the Hierarchy view of your project. Click on metaioSDK to edit the signature in the Inspector window.

4. Add Tracking Config
Create a folder within the StreamingAssets folder and include a tracking configuration with the respective patterns.
Select metaioSDK in the Hierarchy panel, then in the Inspector view choose StreamingAssets in the Select Configuration drop-down menu. You can then simply drag and drop your tracking configuration file into the Inspector view.

5. Binding Unity game objects with tracking data
Instantiate a metaioTracker prefab by dragging and dropping it into the Hierarchy view of your project. A metaioTracker instance is a reference in your Unity project to a corresponding coordinate system (COS) in your tracking configuration file. Note that you have to set a Coordinate System ID that corresponds to a valid COS in the currently set tracking data. Once that is done, you can attach new game objects to the metaioTracker instance.

6. Build and Run for Platform
Check the Player settings and set the application signature before building and running.

iOS
Manually add the following libraries and frameworks in Xcode:
…/Project Folder/Libraries/metaioSDK
libxml2.2.dylib
libc++.dylib
Security.framework
CoreImage.framework

If using a Metaio SDK version below 6.0, you must modify sensors.h, sensors.m, and the .plist file for GPS support.


Android
If an Android device is plugged into the computer via USB, Build and Run in Unity will place the Android package file on your device for you. Otherwise, build a development build of the Unity app for Android Studio.

Metaio Example

For my thesis, I have begun developing a cross-platform augmented reality app with the Metaio SDK and Unity3D.

Metaio SDK

Metaio stands out from the other SDKs as it offers a variety of AR solutions. The free watermarked version of the Metaio SDK is currently supported on Android, iOS, and Windows, with an additional plugin for development in Unity3D for the Android, iOS, Windows, and OS X platforms. There is also support for Google Glass and the Epson Moverio BT-200 glasses. To remove the watermark on applications, you must purchase a Metaio license.

An alternative to the SDK is Metaio Creator, augmented reality software that allows users to create a complete AR scenario through a drag-and-drop interface, without specialized programming knowledge. Additionally, Metaio is the creator of Junaio, a free mobile AR browser available for Android and iOS devices. Junaio allows users to experience mobile augmented reality through multiple channels on their mobile devices. Another solution for creating interactive augmented reality is AREL (Augmented Reality Experience Language), a scripting approach based on HTML5, XML, and JavaScript. AREL can also be used for creating Junaio channel content.

Unity
Unity is a development platform for creating 2D/3D games and interactive experiences across multiple devices, such as iOS, Android, desktop, or game consoles. Below are some steps for setting up the Metaio example project in Unity.

Getting Started

Sign up for a Metaio account to download the SDK. Download the latest version of Unity. You can also sign up as a Unity developer here.

Application Identifier and Signature

Each application needs a unique signature. The Application Signature is a string that matches the Application Identifier; if the Application Identifier is changed, the corresponding Application Signature has to be changed as well.

Once registered as a Metaio developer, log in to create an app signature under the My Apps tab.
Metaio SDK signature

Example App in Unity

1. Start Unity and go to File->Open Project…, select the following directory: \_Unity\Example

2. Open the Unity example project, go to File->Build Settings, and select the desired platform. Unity will automatically re-import the required assets for that platform.


3. Set Application Identifier and Signature
Select Player Settings to edit the Bundle Identifier.
Select metaioSDK from the Hierarchy and edit the SDK signature appropriately.


4. Click Build and Run
For iOS, open the Xcode project exported from Unity. The SDK and some of its dependencies have to be added manually to the Xcode build settings. Go to Build Phases and add the following frameworks and libraries:

…/Example/Libraries/metaioSDK
libxml2.2.dylib
libc++.dylib
Security.framework
CoreImage.framework

For Android, there are two options: export a development build of the Unity app for Android Studio, or plug your Android device in via USB and use Build and Run to install the Android package file directly onto the device.

Note that you don't necessarily have to test on a mobile platform – you can also just hit the Play button to run the examples directly in the Unity Editor. Choose the platform in the Build Settings menu.

Augmented Reality

Augmented Reality (AR) allows for interaction with the physical world: any element in the real world can be augmented with sound, video, images, or 3D objects based on GPS data and image or marker recognition. These augmented elements are typically overlaid on a live camera feed via webcam or mobile device, and now wearables such as "smart goggles/glasses". There are many Software Development Kits (SDKs) available that provide tools and libraries to easily develop AR applications on multiple platforms, such as iOS, Android, Windows Mobile, BlackBerry, or desktop (OS X, Windows, Linux). That said, each SDK has its own features and limitations for its intended platforms.

Below is a list of features provided by the leading AR SDKs.

Interactive Haunted Hallway

This Halloween I was tasked with designing a few themed interactives.

CHOOSE YOUR LEVEL OF FEAR

I came up with a couple of ideas, one being a choose-your-own-adventure style interactive where participants select a level of fear they are comfortable with. On a scale from 1 to 5 (1 being the lowest), participants are unknowingly selecting a possibly creepy Halloween-themed GIF to be projected on a scrim in the hallway they will walk through.

SETUP

Using foam board, I constructed a basic control box to house all the electronic components.


The Arduino is connected to a breadboard with LEDs, a potentiometer, and a servo motor.


The Arduino is also connected to a MacBook, with a long HDMI cable running to a pico pocket projector mounted in the hall.


MOTORIZED SPIDERS

I planned to have spiders drop down from the ceiling when movement is detected. I started prototyping with servo motors, but in order to wind the spider up in both directions, I moved on to a couple of DC toy hobby motors from Adafruit. However, working with DC motors was quite the challenge: there was a lot of troubleshooting involved, and the hobby motors I got turned out not to be ideal. I had purchased a few different-sized Halloween spiders, but the motors' torque and size could not handle the weight of any of them, so I constructed some spiders out of pipe cleaners. Eventually, everything worked.

SETUP

Again, I constructed another foam board box to be mounted to the ceiling/wall and house all the electronic components.


The PING sensor's cable is long enough for the sensor to be mounted on the side of the wall, where it is triggered when participants walk by.


Watch this video to see the interactives in action.

For further documentation of the Haunted Hallway project, feel free to download this PDF.

Stretch Sensor

Conductive Rubber Cord (Stretch Sensor)

Similar to reading a thermistor, the program measures the analog voltage from a divider and converts it back to resistance. When the cord is stretched, its resistance changes, and the program uses that reading to drive an LED.


// Sensor end 1 - GND
// Sensor end 2 - Analog In 0, with 10K resistor to +5V
int LedPin = 11;       // LED connected to digital pin 11 (PWM-capable)
int SensorPin = A0;    // Sensor connected to analog pin A0


void setup()
{
    // initialize serial communications
    Serial.begin(9600);
    pinMode(LedPin, OUTPUT);
}

void loop()
{
    // read the voltage from the voltage divider (sensor plus resistor)
    int sensor = analogRead(SensorPin);

    /*
    map the analog range to the output range:
    map(value, fromLow, fromHigh, toLow, toHigh)
    value: the number to map
    fromLow: the lower bound of the value's current range
    fromHigh: the upper bound of the value's current range
    toLow: the lower bound of the value's target range
    toHigh: the upper bound of the value's target range
    */

    int output = map(sensor, 30, 70, 0, 255);
    output = constrain(output, 0, 255);    // keep the PWM value in range

    // print out the result
    Serial.print("analog input: ");
    Serial.print(sensor, DEC);
    Serial.print(" output: ");
    Serial.println(output, DEC);

    analogWrite(LedPin, output);

    // pause before taking the next reading
    delay(100);
}

Conductive Rubber Cord from Adafruit


Colorado National Monument

Over the last few months, I've been researching and developing an augmented reality iOS app for Colorado National Monument in Grand Junction, CO. The app's key feature allows visitors to stop at augmented waysides in the park.

Moisture Sensor

If It Rains, It Tweets!

A sensor + Arduino + XBee transmit data to a Raspberry Pi + XBee, which tweets if moisture has been detected.

How does it work?

The moisture sensor is an Arduino connected to an XBee adapter and a printed circuit board with a 555 timer. Leads connected to a sponge send resistance data to the circuit board. The Arduino reads the data, adds the time, and transmits the data to an XBee connected to a Raspberry Pi. The Raspberry Pi sends out a tweet with the moisture status, e.g., "Extreme Moisture".

Materials

XBee (transmit/receive), breadboard, leads, sponge, Arduino Uno, Raspberry Pi, USB wifi dongle, printed circuit board with a 555 timer for taking moisture readings

Moisture sensor using a 555 timer IC (Express PCB)

One way of quantifying the amount of moisture in a porous medium is to measure its electrical resistance: typically, the higher the water content, the lower the resistance. Under most circumstances the resistance is in the megohm (million ohm) range. For reference, 100 feet of the copper wire used in house wiring has a resistance of a few ohms at most; a moist medium has roughly a million times more resistance. We wanted to make a sensor that gives a number compatible with the kinds of data an Arduino can comfortably handle: an integer between 1 and 20,000 or so.

One way of doing this is to make a resistor-capacitor (RC) circuit in which we charge up the capacitor and let it discharge through the moist resistive medium. A common 555 timer integrated circuit (IC) uses an RC circuit for its time base, and we exploited this feature to make a moisture sensor that can be triggered and read by an Arduino. The 555 can be configured in a number of ways for this application; we chose to use it as a monostable multivibrator, a one-shot. Send the circuit a trigger pulse and the 555 produces an output pulse whose duration is determined by the values of the resistor and capacitor used for its time base. In this case the resistor is the moist medium, and we can choose a capacitor that produces a pulse hundreds to a few thousand milliseconds long (0.1 sec to a few seconds).
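For reference, the pulse width of a 555 in monostable mode follows the standard formula t = 1.1 × R × C. A quick sanity check in Python, with illustrative component values (not the exact ones on our board):

# standard 555 monostable pulse width: t = 1.1 * R * C
R = 1.0e6   # resistance of the moist medium in ohms (illustrative)
C = 1.0e-6  # timing capacitor in farads (1uF)

t = 1.1 * R * C        # pulse width in seconds
print(t * 1000, "ms")  # -> 1100.0 ms, the same order of magnitude we measured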

We tested it with damp paper towels and sponges and determined that a 1uF capacitor works well, giving an output pulse of about 500-1000 milliseconds. We decided to measure moisture in milliseconds: the shorter the time, the higher the water content of the medium. Since we were going to deploy a number of sensors, we designed a small printed circuit board for the sensor and sent our design off to expresspcb.com to make the bare board.

555 moisture sensor schematic

555 moisture sensor

Arduino + XBee transmits to Raspberry Pi + XBee

Each setup has its own script for receiving or transmitting data.


Initial Raspberry Pi Set Up

To streamline the development process, we set up remote access via SSH. To complete this, we connected a USB wifi dongle to the Pi. The wifi connection is also important to the setup, as we need internet access for the Twitter implementation.

Serial Communication with the Raspberry Pi

During development, we researched Raspberry Pi GPIO and USB serial communications. One method for accessing the GPIO pins is through the UART, or universal asynchronous receiver/transmitter, the piece of hardware that translates data between parallel and serial forms. By default, the Raspberry Pi serial is configured to use the UART for console input and output, which means the serial port could not be used in our program. In order to access the dedicated UART pins on the Raspberry Pi, the serial ports needed to be reconfigured. After a few modifications, we could then receive communications. We later decided to make use of an FTDI USB cable connected directly to an XBee receiver.
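For reference, the usual modifications on the Raspbian releases of that era were to free the UART from the console (details vary by release): remove the console=ttyAMA0,115200 entry from /boot/cmdline.txt, comment out the getty line referencing ttyAMA0 in /etc/inittab, and reboot.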

We chose to use pySerial, a module that provides access to the serial port from a Python script. Like other Python modules, pySerial needs to be installed and then imported into your program.
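As a minimal sketch of the receiving side (the port name and baud rate here are assumptions; adjust them to match the FTDI cable and XBee settings):

# minimal pySerial sketch: open the port and echo each line received
import serial

ser = serial.Serial('/dev/ttyUSB0', 9600, timeout=1)  # assumed port and baud

while True:
    line = ser.readline().decode('ascii', errors='ignore').strip()
    if line:
        print(line)  # one reading forwarded by the Arduino/XBee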


The serial data created by the moisture-sensing circuit board is transmitted from the Arduino to the Raspberry Pi between XBees via radio frequency.


The Pi uses pySerial to accept that data, reading from the specified USB port. Each XBee that transmits data has been given a specific numerical name, which is converted into a readable ID of XBee 1 or XBee 2 when its presence is detected.


Based on the values we were seeing in our tests, we specified any value of 3500 or above to mean that no moisture is detected. A moisture status message is sent to the console and Twitter every time data is received: a value in the range of 700-1499 has a moisture status of "Extreme Moisture", 1500-2249 is "Medium Moisture", and 2250-3500 is "Minimum Moisture". Each time moisture is detected, the program resets itself to false and waits for new data to be received.
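A minimal sketch of that threshold logic (the cutoffs mirror the ranges above; the function name is illustrative):

# map a raw reading from the 555 circuit to a moisture status string
def moisture_status(value):
    if value >= 3500:
        return None  # no moisture detected
    elif value >= 2250:
        return "Minimum Moisture"
    elif value >= 1500:
        return "Medium Moisture"
    elif value >= 700:
        return "Extreme Moisture"
    return None      # below the calibrated range

print(moisture_status(1200))  # -> Extreme Moisture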


Twitter Implementation
To access the Twitter API from Python, we chose Tweepy, the well-known Twitter library for Python. In order to set up authentication keys, we signed up for a Twitter developer account.



Pi Beacon


What are Beacons/iBeacon?

Beacons are hardware devices that use Bluetooth Low Energy to advertise a Universally Unique Identifier (UUID) to devices like a smartphone. iBeacon is Apple's protocol for accessing the UUID data advertised by a beacon. This technology allows for proximity sensing and indoor positioning: for example, a device can alert a user when they are in range of a beacon and surface information based on that location.

Beacon hardware, such as Estimote or Gimbal, is fairly inexpensive. However, it is also really simple to build your own with a Raspberry Pi.

Resource: How to make a Pi Beacon from Radius Networks

Raspberry Pi Beacon transmitter

Using a Bluetooth LE dongle connected to a Raspberry Pi and a script to transmit a UUID, the Raspberry Pi can act as a beacon.


Resource: Installing Bluetooth on a Raspberry Pi from modmypi.com

To advertise the UUID on startup, I ran the script from /etc/rc.local

edit /etc/rc.local command

~$ sudo nano /etc/rc.local

While inside the editor, modify /etc/rc.local so that the Raspberry Pi navigates to the directory of the beacon script and runs it:

cd /home/pi/PiBeacon
sudo bash startBeacon

startBeacon script
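A startBeacon script of this kind boils down to a few hciconfig/hcitool commands: bring the adapter up, write iBeacon advertising data, and enable LE advertising. A rough sketch (the UUID, major, minor, and measured-power bytes are placeholders, not my actual values):

sudo hciconfig hci0 up
sudo hcitool -i hci0 cmd 0x08 0x0008 1E 02 01 1A 1A FF 4C 00 02 15 E2 0A 39 F4 73 F5 4B C4 A1 2F 17 D1 AD 07 A9 61 00 00 00 00 C8 00
sudo hciconfig hci0 leadv 3
sudo hciconfig hci0 noscan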

stopBeacon script
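A matching stopBeacon script can simply disable LE advertising:

sudo hciconfig hci0 noleadv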

 

iOS Beacon Receiver app
After reading the iBeacon documentation and researching the Estimote SDK, I developed a sample application that displays a different clickable icon based on the level of proximity to my Raspberry Pi Beacon. Depending on which icon is clicked, different content is displayed.

 

 

Levels of proximity: immediate, near, and far.

Pi Tweet

Wifi
Install wifi dongle for internet access on the Pi

Twitter/Tweepy
Tweepy is a Python library for accessing the Twitter API. In order to use the library, I first had to set up authentication keys by signing up for a Twitter developer account.


sample Python Tweepy script:
Tweet.py

#!/usr/bin/env python

import sys
import tweepy

#api keys from twitter
CONSUMER_KEY = 'XXXXXXXXXX'
CONSUMER_SECRET = 'XXXXXXXXXX'
ACCESS_KEY = 'XXXXXXXXXX'
ACCESS_SECRET = 'XXXXXXXXXX'

auth = tweepy.OAuthHandler(CONSUMER_KEY, CONSUMER_SECRET)
auth.set_access_token(ACCESS_KEY, ACCESS_SECRET)
api = tweepy.API(auth)
api.update_status(sys.argv[1])  # tweet the text passed as the first argument

terminal command:

python Tweet.py "Hello, from my Pi"

Pi Photobooth

I connected a PiCam and a thermal printer to a Raspberry Pi. When you press a button, the camera takes a picture and immediately prints the image. (A sketch of the capture-and-print loop appears after the materials list below.)

Materials

Raspberry Pi
PiCam
Mini Thermal Receipt Printer
breadboard
jumper wires
10k resistor
button
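Below is a minimal sketch of the capture-and-print loop. It assumes the button is wired to GPIO 17 with a pull-up (alongside the 10k resistor above) and that the thermal printer has been set up as a CUPS printer per the tutorials linked below; pin numbers and paths are illustrative.

# minimal photobooth sketch: wait for a button press, take a picture,
# then hand the image to the printer queue
import subprocess
import time

import RPi.GPIO as GPIO
from picamera import PiCamera

BUTTON = 17  # BCM pin number (illustrative)

GPIO.setmode(GPIO.BCM)
GPIO.setup(BUTTON, GPIO.IN, pull_up_down=GPIO.PUD_UP)

camera = PiCamera()

try:
    while True:
        GPIO.wait_for_edge(BUTTON, GPIO.FALLING)   # block until pressed
        path = '/home/pi/photo-%d.jpg' % int(time.time())
        camera.capture(path)
        subprocess.call(['lp', path])              # print via CUPS queue
finally:
    GPIO.cleanup()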

Thermal Printer Tutorials

Little Box of Geek Project from Geek Gurl Diaries
Part 1
Part 2

Picam Tutorial

Python PiCam setup from Raspberry Pi


Interactive Marionettes

Gustave Baumann was born in Germany and emigrated to the United States at the age of ten. While he is known as an American artist, primarily as a printmaker, his marionette skills were deeply rooted in his German heritage.

The New Mexico Museum of Art in Santa Fe owns a collection of over seventy marionettes carved by artist Gustave Baumann during the 1920s and 30s. Their age and fragility means that they are rarely displayed and never used as designed. Figuring out a way to surface this collection and to encourage an interactive experience has been a goal for the collection’s curators for a long time.

The museum approached the development team from Media Arts & Technology at New Mexico Highlands University. In order to allow visitors to interact with a rarely seen collection, five of these unique marionettes have been scanned via photogrammetry. Using current technology, the scans have been repurposed to demonstrate the articulation of the marionettes.

The project involves Microsoft Kinect hardware and software written for Unity3D to allow the user to embody a marionette. I recently prototyped another version, which uses the Leap Motion Controller and JavaScript to provide the user with the fine-tuned experience of controlling a marionette's movements.

The Leap Motion Controller senses a significantly finer range of motion than the Kinect and is an ideal solution to complete the user experience. Using the JavaScript framework for Leap, marionette data is attached to ragdoll physics bodies which respond to fine movements of the user's fingers and hands. The idea was to mimic the control planes used by traditional puppeteers, but I discovered the Leap hardware has some sensitivity issues sensing the roll of a user's hand. Pitch and yaw are easily readable, but on their own they would give an incomplete experience, so I modified the metaphor to give the feeling of controlling the strings directly with the user's fingers.

The Baumann Marionettes Interactive is included in the exhibit "Gustave Baumann: A Santa Fe Legend", which opened at the Las Cruces Museum of Art in February 2014.

demo site: lpb-riannetrujillo.com/marionettes

Leap Experiment

I'm interested in gesture-based interfaces and have been learning about the Leap Motion controller, a device that senses hands, fingers, and multiple gestures, allowing you to interact with content in a new way.

Leap Motion controller

Leap Experiment

Below is some source code I wrote for an example that tracks a color-changing circle to your fingers. If you have a Leap, check out the demo.

index.html

main.js

document.addEventListener("DOMContentLoaded", init);

var canvas;
var ctx;

function init() { 
	
	canvas = document.getElementById("leap-canvas");

	// fullscreen
	canvas.width = document.body.clientWidth;
	canvas.height = document.body.clientHeight;

	// create a rendering context
	ctx = canvas.getContext("2d");
	ctx.translate(canvas.width/2,canvas.height);

	//change color of pointable at interval
	setInterval(color, 50);
	
	
	// listen to Leap Motion
	Leap.loop(draw);
	
}

function color() {

	// pick a random rgb color for the circles
	var r = Math.floor(Math.random() * 255);
	var g = Math.floor(Math.random() * 255);
	var b = Math.floor(Math.random() * 255);

	ctx.fillStyle = "rgb(" + r + ", " + g + ", " + b + ")";

}

// render each frame
function draw(obj) {
  // clear last frame
  ctx.clearRect(-canvas.width/2,-canvas.height,canvas.width,canvas.height);
  
  // render circles based on pointable positions
  var pointablesMap = obj.pointablesMap;
  for (var i in pointablesMap) {
    // get the pointable's position
    var pointable = pointablesMap[i];
    var pos = pointable.tipPosition;

    // create a circle for each pointable
    var radius = Math.min(600/Math.abs(pos[2]),20);
    ctx.beginPath();
    ctx.arc(pos[0]-radius/2,-pos[1]-radius/2,radius,0,2*Math.PI);
    ctx.fill();	

   }	

}

Download from Github

Wii Paint


Wii Paint is a desktop application I wrote in Objective-C. It works in conjunction with a Wii Remote, a battery-operated IR sensor bar, and DarwiinRemote, an application that maps Wii Remote buttons to keyboard keys.

wii sensor

Wii Remote

Keyboard keys for the WiiPaint desktop application:
n is for new canvas (clears the screen)
b goes back in the array of colors
c goes forward in the array of colors

Using IR mode maps Wii Remote movements to mouse movements via the IR sensor bar.

Darwiin Preferences:
Darwiin controls

Emergence Interactive Timeline


THE TIMELINE
Life's origins are represented as a series of events depicted through Earth's geosphere and biosphere. The exhibit timeline also includes specimens and illustrations of early Earth, as well as the formation of Earth.

THE PROJECT
Design and develop an interactive version of the exhibit timeline.
KEY COMPONENTS
text (bio/geo content)
images (earth illustrations)
audio/video (scientist interviews)
easily updatable (by museum staff)
designed for iPad (for use in exhibit)
accessible on the web (for use in classroom)

DESIGN & DEVELOPMENT
Use of the following:
HTML5/CSS3 + Canvas
XML
Javascript
fancybox + jQuery

THE CHALLENGES
The overall challenge of this project has been that it is still early in the HTML5 adoption cycle, and there is limited documentation on Canvas implementation.

ISSUES
– Re-sizing based on browser window
– Click Coordinates
– Drawing Video on iPad

SOLUTIONS
– Cross-browser/platform Testing
– Use Open Source Frameworks/Libraries/Tools (fancybox)

Demo Site (best viewed in Chrome Browser): lpb-riannetrujillo.com/emergence

Media Queries

Below is a short example of my use of CSS media queries for the Alcoves 12.0 responsive website.


Navigation CSS

nav {
    width: auto;
    height: auto;
    margin-top: -80px;
    margin-right: 40px;
    padding: 0;
    float: right;
    /* background */
    background: #404c5f;
}

nav ul {list-style: none;}

nav ul li {
	float: left;
	position: relative;
}

nav ul li a {
	font-weight: bold;
	color: #fff;
	display: block;
	padding: 10px 20px;
	text-decoration: none;
}

nav ul li a:hover {color: #7791ae;}
nav ul li:hover, .active {background: none;}
nav  ul li ul { display:none;  background: #4c617a; }
nav  ul li:hover ul { display:block; position:absolute;}
nav  ul li ul li {color:#fff; width: 180px; display:inline-block; float: left;}
nav ul li:hover ul li {background: none;  }
nav ul li:hover ul a {padding: 5px 20px;}
nav ul li:hover ul a:hover {background: #647d99; }


@media screen and (max-width: 720px) {
  nav {
	width: 100%;
	height: auto;
	margin: 0;
	padding: 0;
        position: relative;
	/* box shadow */
	-webkit-box-shadow: inset 0 1px 0 rgba(255,255,255,.1), 0 0px 1px rgba(0,0,0,.4);
	-moz-box-shadow: inset 0 1px 0 rgba(255,255,255,.1), 0 0px 1px rgba(0,0,0,.4);
        box-shadow: inset 0 1px 0 rgba(255,255,255,.1), 0 0px 1px rgba(0,0,0,.4);
    }

    nav ul li a {
	font-weight: bold;
	color: #fff;
	display: block;
	padding: 10px 40px;
	text-decoration: none;
    }
    
    nav ul li a:hover {color: #fff;}
    nav ul li:hover, .active {background: #58595b;}
}

AmeriCorps 2012-2013


I'm currently in my third term of the program, interning at The New Mexico Museum of Art as well as the New Mexico Museum of Natural History & Science.

At The New Mexico Museum of Art, I was tasked with creating a gallery kiosk to showcase artists from the Alcove Shows. Alcove 12.0 included the work of 45 artists from across the state of New Mexico, with five new artists exhibited every five weeks. The kiosk allows visitors to preview work from previous weeks. Throughout the remainder of the internship, I programmed various gallery interactives to include additional videos and/or photos for exhibits. I also designed templates to be used for events, as well as a redesign and style guide for creating email blasts.

While at the New Mexico Museum of Natural History & Science, I was able to revisit an exhibit I worked on as a student in 2011. As part of its National Science Foundation funding, the Emergence: A New View of Life's Origins exhibit was to provide educational resources and teacher materials. I redesigned the Emergence website and included a resources section, and I also designed promotional materials. Promo materials and resources were loaded onto custom USB drives and mailed to science education teachers in New Mexico.

Overall, I think being able to gain experience in various institutions has prepared me for future employment.  My experience with AmeriCorps has been invaluable.

MW 2013

Last week, I attended Museums and the Web in one of my favorite cities, Portland. I served as a conference volunteer, as well as a representative for NMHU and AmeriCorps. It was a really great experience being able to network with others in the museum and technology field.

Here is a short video I compiled after my trip.

C++ Black Jack

Below is a text-based Blackjack game I developed in C++.

blackjack.cpp file

//============================================================================
// Name        : blackjack.cpp
// Author      : Rianne Trujillo
// Version     :
// Copyright   : 
// Description : Black Jack in C++
//============================================================================

#include <iostream>
#include <string>

#include "dealer.h"
#include "player.h"

using namespace std;

int main() {

	//display game title
	cout << "BLACKJACK\n" << endl;

	//player class + human instance
	Player human;
	//if playerName is set to default, ask for name
	if (human.name=="default") {
		std::string playerName;
		cout << "Enter Your Name: ";
		cin >> playerName;
		human.name = playerName;
		cout<<endl;
	}

	//Print Greeting
	cout << "Welcome " << human.name << ". Let's Play!\n" << endl;

	human.newGame();

	return 0;

}

player.cpp file

/*
 * player.cpp
 */

#include <cstdlib>   // srand, rand
#include <ctime>     // time
#include <iostream>

#include "player.h"
#include "dealer.h"

Player::Player () {
	name="default";
}


void Player::play21(void) {

	//randomize the cards
	srand((int) time(0));

	//classes+instances
	Dealer dealer;
	Player player;

	// deal the cards
	int person = dealer.dealCards(2, "Your Cards:");
	std::cout << " = " << person;
	std::cout << "\n";
	int computer = dealer.dealCards(2, "Computer's Cards:");
	std::cout <<" = " << computer;
	std::cout << "\n";
	// Ask if human wants a hit
	//player.hit(person);

	char takeHit = 'y';
		while (takeHit != 'n') {
			if (person < 21) {
				//does the player want a hit?
				std::cout << "\nDo you want a hit (y/n)? ";
				std::cin >> takeHit;
				//If yes
				if (takeHit == 'y'){
					//Deal a card
					person += dealer.dealCards(1,"\nHit:");
					std::cout << "\n";
					//total
					std::cout << "Total: " << person << "\n";

				} else //the player does not want another card
				takeHit ='n';
			} else {
				//the player has busted
				if (person > 21)
				std::cout <<"\nYOU BUSTED!\n";
				takeHit ='n';
			}
		}

	//Determine if computer takes a hit
	while ((computer < person) && (computer <= 21) && (person <= 21)) {
		std::cout << "\n";
		computer += dealer.dealCards(1,"The Computer took a card: ");
	}

	//show who won.
	dealer.determineWinner(person, computer);

}

void Player::newGame(void){
	char keepPlaying;
	//set variable for keepPlaying char to n
	keepPlaying = 'n';

	do {//do play21 function while keepPlaying = y
		play21();

		//ask for keepPlaying input
		std::cout << "\nDo you want to play another hand (y/n)?";
		std::cin >> keepPlaying;
		std::cout << "\n";

	} while (keepPlaying=='y');

	if  (keepPlaying=='n')  {//if no, print game over
		std::cout << "\nGAME OVER.\nThanks For playing!";
	}

}

player.h file

/*
 * player.h
 */

#ifndef PLAYER_H_
#define PLAYER_H_

#include <iostream>
#include <string>

class Player {
	public:
	// properties:
	std::string name;
	void play21(void);
	void newGame(void);
	Player();

private:
	std::string playerName;

};


#endif /* PLAYER_H_ */

dealer.cpp file

/*
 * dealer.cpp
 */

//the dealer shuffles the cards and
//gives cards to players when they ask for a hit

#include <cstdlib>   // rand
#include <iostream>
#include <string>

#include "dealer.h"
#include "player.h"

Dealer::Dealer() {
	personScore = 0;
	computerScore = 0;

}


int Dealer::dealCards(int numCards, std::string message){
	//deal cards
	//set cardDealt and totalValue to 0
	int cardDealt = 0;
	int totalValue = 0;
	//print players cards to the screen
	std::cout << message << " ";
	//deal the number of required cards
	for (int i = 1 ; i <= numCards ; i++){
		//deal a card between 1 and 10
		cardDealt = Shuffle(1, 10);
		//if the card dealt is an ace (1)
		if (cardDealt == 1){
			//count the ace as 11 when that will not bust the hand
			if (totalValue + 11 <= 21)
			cardDealt = 11;
			//otherwise count it as 1
			else {cardDealt = 1;}
		}

		//accumulate the card values
		//totalValue is equal to the number of cards dealt
		totalValue += cardDealt;
		std::cout << cardDealt << " ";
	}
	//return total value
	return totalValue;
}


void Dealer::determineWinner(int person, int computer) {
	this->computerScore=computer;
	this->personScore=person;

	//Display total scores
	std::cout <<"\nYour Score: " << person;
	std::cout <<"\nComputer Score: " << computer;
	std::cout << "\n";

	//Display winner
	//if person equals computer, it's a tie
	//if person == 21, or person >= computer, or computer > 21 (while person <= 21), person wins
	//else the computer wins
	if (person == computer)
	std::cout << "\nTie";
	else if ((person == 21 || person >= computer|| computer > 21) && (person <= 21))
	std::cout <<"\nYou Won!\n";
	else
	std::cout <<"\nThe Computer Won!\n";
}

int Dealer::Shuffle(int lowerLimit, int upperLimit) {
	//returns a random number within the given boundary
	return lowerLimit + rand() % (upperLimit - lowerLimit + 1);
}

dealer.h file

/*
 * dealer.h
 */

#ifndef DEALER_H_
#define DEALER_H_

#include <iostream>
#include <string>

class Dealer {
public:
	int dealCards(int numCards, std::string message);
	void determineWinner(int person, int computer);
	int Shuffle(int lowerLimit, int upperLimit);
	Dealer();

private:
	int personScore;
	int computerScore;

};

#endif /* DEALER_H_ */

AmeriCorps 2011-2012

When my previous AmeriCorps internship ended in August 2011, I decided to re-enroll for a full-time position. In October, I continued interning at the same institution while splitting time at the New Mexico Department of Cultural Affairs. Over the course of the year, I was able to continue utilizing and developing my skills.

At The Albuquerque Museum of Art and History, I designed and developed an iOS app that provided artwork and artist information for the ISEA2012 exhibition installed at the museum. I was also given the opportunity to design posters, brochures, and educational booklets. During Albuquerque's centennial, I created a scavenger hunt using SCVNGR, designing signage explaining how to play as well as recognizable stickers placed near artworks related to the hunt.

At the New Mexico Department of Cultural Affairs, I designed and developed the backend for the Office of Archaeological Studies Pottery Typology Project website. The admin interface allows users to enter pottery types into a database. I also provided content management and developed registration forms for the New Mexico Association of Museums Annual Conference.

AmeriCorps 2011

Sculpture Garden App

After graduating from New Mexico Highlands University in May 2011, I enrolled in the AmeriCorps Cultural Technology program and soon began a minimum-time position as an intern at The Albuquerque Museum of Art and History. My scope of work consisted of recording and editing audio, as well as designing and developing a mobile audio tour of the museum's sculpture garden.

PICT

PICT: Program for Cultural Technology
New Mexico Museum of Natural History & Science
Emergence: A New View of Life’s Origin exhibit

Collaborated on the design and development of "Emergence: A New View of Life's Origin", a permanent exhibit at the New Mexico Museum of Natural History & Science.

Visit Website