Cornell University

Award Winners for Inaugural Cornell Cup USA, presented by Intel

First Place - IVS, Portland State University
Advances in medical care now save millions of lives and cure numerous diseases. However, the accompanying proliferation of pills and drugs has made identifying them increasingly challenging and time-consuming. In 2006, there were 46 emergency room (ER) visits per 100 people, with an average visit lasting 2.6 hours [1]. To address this problem, this team proposes the Prescription Drug Identification device (PDI), which minimizes identification time, increases accuracy, and provides detailed, instant drug information to both patients and doctors. The device is designed mainly for ERs but can also be used in doctors' offices or at home.

The PDI's camera will capture images of the drugs. Image-processing techniques will then extract drug characteristics such as imprint code, shape, and color. These features will be looked up in a built-in offline database or a much larger online database to provide the needed information. Users can also browse the database manually for drug details or update it with the latest information. A device with such features will significantly help doctors in emergency rooms, where time and precision are critical.
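
A minimal sketch of the lookup stage is shown below, assuming an upstream image-processing step has already reduced a photo to a few pill characteristics; the database entries, field names, and match_pill helper are illustrative stand-ins, not real drug data or the team's actual code.

```python
# Illustrative PDI lookup stage: match extracted pill features against a small database.
# The entries and field names here are made up for demonstration only.

PILL_DB = [
    {"imprint": "L484", "shape": "capsule", "color": "white", "name": "Example drug A"},
    {"imprint": "44 291", "shape": "round", "color": "orange", "name": "Example drug B"},
]

def match_pill(features, db=PILL_DB):
    """Return database entries whose attributes agree with the extracted features.

    `features` is a dict such as {"imprint": "L484", "shape": "capsule", "color": "white"};
    missing or empty values act as wildcards, so a partial extraction still narrows the search.
    """
    candidates = []
    for entry in db:
        if all(entry.get(key) == value for key, value in features.items() if value):
            candidates.append(entry)
    return candidates

if __name__ == "__main__":
    extracted = {"imprint": "L484", "shape": "capsule", "color": "white"}
    for hit in match_pill(extracted):
        print("Possible match:", hit["name"])
```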

Second Place - HAWK, University of Pennsylvania
We propose the prototyping of a sensor-enabled quadrotor aircraft platform to construct 3D building models to support search and rescue operations. This technology will provide sensor-rich visualization of buildings overlaid with temperature gradients, air contamination, and hazardous zones for more informed rescue strategies. The proposed system has the potential to lower first responder casualties and minimize mission execution time.

The goal of the H.A.W.K. project (Helicopter Aircraft Wielding Kinect) is to build upon a pre-built quadrotor aircraft platform and add a low-cost depth camera (a stripped-down Xbox Kinect) to conduct rapid Simultaneous Localization And Mapping (SLAM) of a building in a single fly-through. Each Intel Atom-powered quadrotor will locally capture the depth and image data and transfer it to the base station for SLAM processing and rendering of a 3D building model. As each craft is deployed with multiple sensors on board, we will incorporate thermal camera, CO, and SO2 sensor data to overlay thermal gradients and air contamination information over the captured 3D building model. The system incorporates advanced flight controls, runtime UAV coordination algorithms, and low-cost image and depth perception sensors to execute complex SLAM algorithms.
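
As a rough illustration of the quadrotor-side capture-and-forward loop, here is a minimal sketch assuming the Kinect depth stream is exposed by a driver as raw frames; get_depth_frame() is a stand-in stub, and the HOST/PORT values are placeholders for the base station, not the team's actual configuration.

```python
# Illustrative capture-and-forward loop: stream length-prefixed depth frames to a base station.
import socket
import struct
import time

HOST, PORT = "192.168.1.10", 5005   # placeholder base-station address

def get_depth_frame():
    """Stub for the Kinect driver call; returns one depth frame as raw bytes."""
    return bytes(640 * 480 * 2)     # 16-bit depth image, all zeros in this sketch

def stream_frames(duration_s=1.0):
    """Send frames for duration_s seconds; requires a listening base station to run."""
    with socket.create_connection((HOST, PORT)) as sock:
        end = time.time() + duration_s
        while time.time() < end:
            frame = get_depth_frame()
            # Length-prefixed framing so the base station can split the byte stream back into frames
            sock.sendall(struct.pack("!I", len(frame)) + frame)
```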

Third Place - Team Squirtle, Massachusetts Institute of Technology
As with all disruptive technologies, personal robotics has been in search of a killer application to truly drive forward technology adoption, growth, and innovation. We believe that one such application lies at the heart of biotechnology in the form of liquid-handling robots: huge, couch-sized machines designed to reliably and accurately transfer and mix liquids, used in everything from basic chemistry to the latest cutting-edge cancer research. Despite such versatility, liquid handlers have traditionally been relegated to high-throughput automation tasks. However, we believe that their use can extend much farther than that. In the same vein as the evolutionary step from mainframe computers to PCs, our project discards conventional wisdom about the design of massive high-throughput machines and builds a liquid handler that is smaller, lower-priced, and more intelligent, designed around the core idea of allowing individual researchers to advance the frontiers of science with a tool that is faster, more accurate, and more tailored to their personalized liquid-handling needs.

People’s Choice - Kinecthesia, University of Pennsylvania
Over 284 million people are visually impaired worldwide: 39 million are blind and 245 million have low vision (World Health Organization, 2011). The goal of our project is to provide a simple, low-cost solution that serves as a better aid than walking canes and Seeing Eye dogs for visually impaired people.

With that in mind, we created Kinecthesia, a wearable belt that can detect obstacles and alert the user to their location. We developed a prototype using a Microsoft Kinect for obstacle detection and an array of vibration motors for the feedback system. When the user approaches an object while wearing the Kinecthesia, the belt subtly vibrates. The intensity and location of the vibrations on the belt tell the user exactly where the obstacle is. Using this system, the user can feel their surroundings and navigate around stationary or moving obstacles.
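
A minimal sketch of the depth-to-feedback mapping appears below, assuming the Kinect frame has already been reduced to the nearest obstacle distance in three horizontal zones; the zone names, motor layout, and distance thresholds are illustrative choices, not the team's calibrated values.

```python
# Illustrative mapping from nearest obstacle distance (per zone) to vibration motor intensity.

ZONES = ("left", "center", "right")
MAX_RANGE_M = 3.0   # ignore anything farther than this
MIN_RANGE_M = 0.5   # full-strength vibration at or below this

def vibration_levels(nearest_by_zone):
    """Map {"left": 1.2, "center": 2.8, "right": 4.0} to motor duty cycles in [0, 1]."""
    levels = {}
    for zone in ZONES:
        d = nearest_by_zone.get(zone, MAX_RANGE_M)
        d = max(MIN_RANGE_M, min(d, MAX_RANGE_M))
        # Closer obstacle -> stronger vibration, scaled linearly between the two limits
        levels[zone] = (MAX_RANGE_M - d) / (MAX_RANGE_M - MIN_RANGE_M)
    return levels

print(vibration_levels({"left": 1.2, "center": 2.8, "right": 4.0}))
```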

This system also has applications beyond obstacle avoidance for the blind. It can be used by anyone in a low-visibility situation, such as firefighters or miners. Additionally, the hardware could be repurposed as a navigation system that directs a user towards a destination rather than away from an obstacle.

Wild Card - Blind Assist, Howard University
According to the National Federation of the Blind, “The real problem of blindness … is the misunderstanding [of their surrounding]”.

As a solution to this issue, this team proposes a device that assists the blind in navigating their surroundings. The device will function as a GPS guidance and obstacle avoidance system. We can offer this functionality through components including a sonar sensor, a GPS receiver, and an Internet modem. Our avoidance system will use sonar to detect objects and protect users from collisions with obstacles. The sonar sensor will be connected to a microcontroller that sends data back to the Tunnel Creek board for processing. The system will then send a command to the user alerting them to upcoming obstacles.
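
The alert logic can be sketched very simply, assuming the microcontroller reports a range reading that the board polls; the threshold, the read_range() stub, and the notify_user() call below are illustrative assumptions rather than the team's implementation.

```python
# Illustrative sonar obstacle alert: warn the user when a reading falls below a threshold.
ALERT_DISTANCE_CM = 150

def read_range():
    """Stub for the sonar reading forwarded by the microcontroller."""
    return 120  # pretend an obstacle is 1.2 m ahead

def notify_user(distance_cm):
    """Stand-in for the audible or spoken alert sent to the user."""
    print(f"Obstacle ahead: about {distance_cm / 100:.1f} m")

def check_once():
    distance = read_range()
    if distance <= ALERT_DISTANCE_CM:
        notify_user(distance)

check_once()
```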

There are no widely available devices that take advantage of sonar to protect users in this fashion. The guidance system will bring GPS functionality to persons who were previously denied access due to the nature of their disability. Users will send voice commands (e.g., "Go to Address") to the Bluetooth transceiver, which relays the command to software on our Atom board. Our software gets data from the Internet modem and GPS receiver and returns navigational instructions.
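
One step of that pipeline, turning a GPS fix plus a destination into a coarse instruction, can be sketched with the standard great-circle bearing formula; the coordinates used in the example are placeholders, and a real system would obtain the destination by geocoding the spoken address.

```python
# Illustrative guidance step: compute a compass-direction instruction toward a destination.
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial compass bearing from point 1 to point 2, in degrees clockwise from north."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return (math.degrees(math.atan2(x, y)) + 360) % 360

def instruction(current, destination):
    """Reduce the bearing to one of eight spoken compass directions."""
    b = bearing_deg(*current, *destination)
    directions = ["north", "north-east", "east", "south-east",
                  "south", "south-west", "west", "north-west"]
    return f"Head {directions[int((b + 22.5) // 45) % 8]}"

# Placeholder coordinates for illustration only
print(instruction((38.92, -77.02), (38.90, -77.04)))
```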


Honorable Mention

Interactive Multi-Control Wheelchair, Worcester Polytechnic Institute
The aim of this project is to instrument a wheelchair with an intuitive control and navigation system that integrates voice recognition, face tracking, and hand gesture interpretation. It allows the user to easily select his or her preferred method of control depending on situational demands or personal needs. This robotic wheelchair will use the Intel Tunnel Creek platform and the Atom processor to perform necessary computations. The system will actuate the power wheelchair base, determine when the user is controlling the robot and combine multiple interfaces for greater usability.
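
A minimal sketch of the multi-mode arbitration described above might look like the following, where the user selects one input channel at a time and only commands from the active channel drive the wheelchair base; the channel names and command vocabulary are illustrative assumptions.

```python
# Illustrative input arbitration: only the user's chosen control channel can move the chair.

class WheelchairController:
    CHANNELS = ("voice", "face", "gesture")

    def __init__(self):
        self.active = "voice"

    def select_channel(self, channel):
        """Let the user switch control modes based on situational demands or personal needs."""
        if channel in self.CHANNELS:
            self.active = channel

    def handle(self, channel, command):
        """Ignore input from inactive channels so stray gestures or speech cannot move the chair."""
        if channel != self.active:
            return None
        return f"actuate base: {command}"

ctrl = WheelchairController()
ctrl.select_channel("gesture")
print(ctrl.handle("voice", "forward"))    # None: voice is not the active channel
print(ctrl.handle("gesture", "forward"))  # drives the chair
```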

Current commercial wheelchairs have drawbacks. These systems may not meet real-time requirements, or they are easily influenced by the environment or by unintended actions of the user. They also do not accommodate users who cannot use their hands. No commercial wheelchair has the multi-mode capability of letting the user freely choose between asking the wheelchair to perform tasks orally, with gestures, or by moving his or her head. This project will use affordable commodity hardware to reliably allow people with disabilities who cannot currently use joystick-based power wheelchairs to become mobile, and it will make the wheelchair a user-friendly and pleasant experience for those who are already using traditional electric wheelchairs.

Team DART, Seattle Pacific University
According to the Humane Society, four out of ten U.S. households have a dog. This leads to a common problem and nuisance in today's society: picking up dog waste from one's yard. This chore is inconvenient and takes precious time out of every dog owner's day. There are numerous companies that specialize in dog waste removal services, but these services are expensive and the costs can quickly add up over time. The average dog may produce waste several times per day! Our team's proposed solution to this problem is the AWR (Autonomous Waste Remover).

The AWR will automatically navigate your lawn and remove dog waste daily. The AWR will be a wireless, battery-powered vehicle that incorporates a complex sensor system to navigate around the yard while avoiding obstacles, collecting dog waste, and returning to a base station, where the collected waste can be disposed of. The base station will also automatically recharge the AWR's power system.

Hot Dawgs, Southern Illinois University Carbondale
According to the Department of Energy, heating and cooling a house accounted for around 49% of all household energy usage in 2005 (1). In that year, the cost of heating and cooling a home averaged over $800 (2). Installing a programmable thermostat can save $180 a year if it is set properly (3), but that is still only one sensor, which may or may not be in an ideal location in the home.

Our proposal is to use an Intel Atom processor and board to monitor a network of temperature sensors, one in each room. From the gathered data, the processor will adjust vents in each room to heat or cool the house optimally on a room-by-room basis. Our goals are to reduce the cost of heating and cooling a house, to design the system so that it can be retrofitted into existing houses (which stand to benefit the most), and to provide an easy-to-use interface.
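
A minimal sketch of the per-room vent decision follows, assuming each room reports a temperature and each vent can only be fully open or closed; the setpoint, deadband, and room names are illustrative assumptions, not the team's values.

```python
# Illustrative per-room vent control with a deadband to avoid chattering around the setpoint.

SETPOINT_F = 72.0
DEADBAND_F = 1.0

def vent_command(room_temp_f, mode="cooling"):
    """Return True to open the room's vent, False to close it, None to leave it as it is."""
    if mode == "cooling":
        if room_temp_f > SETPOINT_F + DEADBAND_F:
            return True
        if room_temp_f < SETPOINT_F - DEADBAND_F:
            return False
    else:  # heating
        if room_temp_f < SETPOINT_F - DEADBAND_F:
            return True
        if room_temp_f > SETPOINT_F + DEADBAND_F:
            return False
    return None  # inside the deadband

readings = {"kitchen": 75.2, "bedroom": 71.8, "office": 69.5}
for room, temp in readings.items():
    print(room, vent_command(temp, mode="cooling"))
```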

Sentinel, University of California, San Diego
Sentinel: The Intelligent Wildlife Video Recording System
"Never-before-seen footage" is powerful because it provides a new perspective on the universe. All of a sudden, the world isn't the same anymore; the universe's dynamic nature is laid bare for all to see.

Though YouTube has effectively documented the complete anthology of human behavior, many animal behaviors remain a mystery. Sentinel, a wildlife video recording system, aims to unravel this mystery by capturing the behaviors of elusive and endangered species on film. Currently, video traps either capture video continuously, leaving the end user with days of mostly useless video to sift through, or ineffectively capture bits and pieces of behaviors, leaving the user with incomplete observations. Sentinel is an intelligently triggered video trap that senses and sees its environment in order to efficiently capture high-definition video of all animals that cross its path.
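
One way such a trigger could work is sketched below: keep recording only while recent frames show enough change to suggest an animal is present. The mean-absolute-difference metric, threshold, and hold time are illustrative assumptions standing in for whatever detector the team actually uses.

```python
# Illustrative motion trigger: start recording on a large frame-to-frame change,
# and keep recording for a hold period after the last trigger.

TRIGGER_THRESHOLD = 8.0    # mean per-pixel change that starts recording
HOLD_FRAMES = 120          # keep recording this many frames after the last trigger

def frame_change(prev, curr):
    """Mean absolute difference between two flattened grayscale frames."""
    return sum(abs(a - b) for a, b in zip(prev, curr)) / len(curr)

def should_record(prev, curr, frames_since_trigger):
    """Return (record?, updated frames-since-trigger counter)."""
    if frame_change(prev, curr) > TRIGGER_THRESHOLD:
        return True, 0
    return frames_since_trigger < HOLD_FRAMES, frames_since_trigger + 1

recording, since = should_record([10, 10, 10], [40, 10, 10], frames_since_trigger=999)
print(recording, since)  # True 0: a big change re-arms the trigger
```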

Solar Drone, University of California, Berkeley
The problem of long-endurance flight has yet to be adequately solved with unmanned aerial vehicles (UAVs). Conventional powertrains are typically disadvantaged by limited capacities for non-renewable fuels, but in theory a renewable energy source would permit self-sustaining flight. However, most formal research to date concentrates on developing aircraft with immense wingspans, while overlooking miniature UAVs due to difficulties with effective downscaling. This team intends to bridge that gap by demonstrating multiple-day flight using an autonomous solar-powered UAV with a wingspan of 3m or less.

Meticulous power-saving techniques make continuous flight practical. Minimizing the mass of the airplane decreases the load on the motor, saving energy. Further efficiency relies on an embedded system that, with minimal overhead, can dynamically tune aircraft behavior for strategic power conservation, such as gliding, disabling non-critical electronics, and adjusting altitude. The objective for the UAV is to maintain a permanent station in the air without depending on persistent human direction.
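
As a rough illustration of that kind of power-conservation policy, the sketch below picks a coarse behavior from battery charge and solar input; the thresholds and mode names are illustrative assumptions, not flight-tested values from this project.

```python
# Illustrative power-management policy for the solar UAV.

def select_mode(battery_frac, solar_watts, cruise_watts=30.0):
    """Return a coarse flight/electronics mode for the next control interval."""
    if battery_frac < 0.2 and solar_watts < cruise_watts:
        return "glide, non-critical electronics off"   # stretch the remaining energy
    if solar_watts > cruise_watts * 1.5 and battery_frac > 0.8:
        return "climb"                                 # bank surplus energy as altitude
    return "cruise"

for state in [(0.15, 5.0), (0.9, 60.0), (0.5, 25.0)]:
    print(state, "->", select_mode(*state))
```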

Many applications would benefit from an inexpensive, autonomous, permanently airborne platform: weather tracking, emergency response communications, and high-altitude scientific research. Its smaller size, lower cost, and fewer maintenance requirements will allow the quick deployment of application-specialized drones in a sky-based network.

SWARM, Columbia University
There are numerous environments that, for a myriad of reasons, are inaccessible to human exploration. Our project aims to solve the problem of mapping these environments by using a heterogeneous swarm of microcontroller-controlled robots that autonomously take sensor readings and use this data to build a map of their environment. The exciting aspect of this project is that it is directly applicable to disaster relief zones, hostile environments, and space exploration.

The main challenge of this project is to manufacture numerous low cost robots and incorporate them into a system that is robust enough so that the loss of a robot does not affect the function of the swarm. The overall system will incorporate sensor processing, wireless communications, probabilistic mapping and autonomous robot control. Our solution is unique and exciting in that it segregates the swarm into different roles based on the processing power of each individual robot, enabling our swarm to be more robust and utilize each robot to its fullest.
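
The role segregation could be sketched as simply as the following, where faster robots run the mapping and slower ones only collect and relay readings; the CPU-score cutoff and role names are illustrative assumptions. Because roles are recomputed from whichever robots are still reporting, losing one robot only shrinks the pool rather than breaking the swarm.

```python
# Illustrative role assignment by processing power.

def assign_roles(robots, mapper_min_score=2.0):
    """robots: {robot_id: cpu_score}. Returns {robot_id: role}."""
    roles = {}
    for robot_id, score in robots.items():
        roles[robot_id] = "mapper" if score >= mapper_min_score else "scout"
    return roles

swarm = {"r1": 3.1, "r2": 0.8, "r3": 2.4, "r4": 1.1}
print(assign_roles(swarm))
```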

Audio(G)Fusion, University of Houston
Audio(G)Fusion integrates an Intel processor with an electric guitar. The user may alter audio filters using a touchscreen that is mounted to the body of the guitar. Once the user has configured a filter, they may assign the filter to a preset button to quickly enable or disable it while performing. The user may choose any combination of filters. The maximum number of filters is limited only by the Intel processor’s power.
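
A minimal sketch of that preset mechanism is shown below: each preset button toggles one configured filter, and the active filters are applied in sequence to each audio block. The two filters shown (a gain and a hard clip) are simple stand-ins for the real effect set, and the FilterBank class is illustrative rather than the team's code.

```python
# Illustrative preset filter chain: buttons toggle filters, active filters run in sequence.

def make_gain(factor):
    return lambda samples: [s * factor for s in samples]

def make_clip(limit):
    return lambda samples: [max(-limit, min(limit, s)) for s in samples]

class FilterBank:
    def __init__(self):
        self.presets = {}  # button id -> [filter function, enabled flag]

    def assign(self, button, filt):
        self.presets[button] = [filt, False]

    def toggle(self, button):
        self.presets[button][1] = not self.presets[button][1]

    def process(self, samples):
        for filt, enabled in self.presets.values():
            if enabled:
                samples = filt(samples)
        return samples

bank = FilterBank()
bank.assign(1, make_gain(2.0))
bank.assign(2, make_clip(0.8))
bank.toggle(1)                        # performer hits preset button 1
print(bank.process([0.1, 0.5, -0.7]))
```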

GT Accessors, Georgia Institute of Technology
Accessibility focuses on the degree to which people with disabilities can interact with the world around them. Unfortunately, most embedded applications (apps) for smartphones and tablets are not designed with accessibility in mind, especially for those with upper-body motor impairments. Imagine, then, how much access to technology could expand if we provided alternative input interfaces that increase the accessibility of tablet-based applications.

One such application area is in rehabilitation for children with Cerebral Palsy (CP). CP is the leading cause of childhood disability that affects function and development. Approximately half of these children sustain dysfunctions in upper-extremity activities. These children therefore tend to have difficulty in accessing devices that require fine motor control, such as the common pinch and swipe gesture operations required for tablet access.

As a partial solution to this issue, this team proposes the development of a unique interface device, TabAccess, which provides wireless access to tablet devices. TabAccess will utilize a sensor system that enables individuals who lack fine motor skills to provide the necessary inputs to control their desired apps. We will also test our interface device with a team-designed Virtual Piano Player, based on clinical evidence that supports the use of music in pediatric physical therapy.

The Incredible HUD, Purdue University
The Incredible HUD (the device) is a novel approach to compact, portable heads-up displays. There is a growing need for compact, rugged, and highly integrated augmented-reality displays that provide relevant, real-time information to the user. Be it motorsport, extreme sports, or even defense, there is no shortage of applications for a device that can enhance the user's awareness of his or her surroundings. Current solutions are too expensive, too delicate, too bulky, or too complex for mass consumer appeal. This gives rise to challenges in optics, weight, power consumption, and device stability and durability.

While overcoming the challenges mentioned above and coalescing real-time data from a GPS receiver, accelerometer, thermometer, and video camera, the device presents the data to the user in a flexible format. Telemetry data is logged for review on a computer at a later time. An Intel Atom motherboard provides the video-processing and display-generation capability. The display itself will allow an unobstructed view of the user's surroundings without requiring the eye to refocus to accommodate the display. The Incredible HUD's purpose is to provide a useful display of telemetry to the user via a high-quality, helmet-based heads-up display unit.
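
A minimal sketch of the telemetry-logging path is shown below: one record per sampling interval, combining the sensors named above, written as CSV for later review. The sensor reads are stubbed, and the field list and file name are assumptions rather than the team's actual format.

```python
# Illustrative telemetry logger: poll sensors at a fixed interval and append CSV rows.
import csv
import time

FIELDS = ["timestamp", "lat", "lon", "speed_mps", "accel_g", "temp_c"]

def sample_sensors():
    """Stub that would poll the GPS receiver, accelerometer, and thermometer."""
    return {"lat": 40.4237, "lon": -86.9212, "speed_mps": 12.4, "accel_g": 0.9, "temp_c": 21.5}

def log_telemetry(path="telemetry.csv", samples=3, interval_s=0.1):
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        writer.writeheader()
        for _ in range(samples):
            row = {"timestamp": time.time(), **sample_sensors()}
            writer.writerow(row)
            time.sleep(interval_s)

log_telemetry()
```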

JouleCycle Team, University of Massachusetts, Lowell
Obesity is recognized as a serious public health problem that leads to many illnesses, such as diabetes and heart disease. According to the CDC, about one-third of U.S. adults and 17% of children and adolescents aged 2-19 years are obese. Since exercise is an effective way to control obesity, the team proposes to design a gaming system called "JouleCycle" to help people get regular exercise and achieve caloric balance. The gaming system is built upon a human-powered bicycle and an Intel Atom development board, without using any battery.

Being battery-less, JouleCycle has the game player pedal a bicycle to generate the power that runs the Atom board and its customized hardware and software. The team will design control circuits to measure energy generation and consumption, a customized BIOS and OS to speed up the boot process, and multithreaded user applications that leverage the power steps of the processor and accelerators. The gaming software will quantify and visualize the energy generated by the player and consumed by the system components and different tasks. To make the game interesting and enjoyable, the power generated by the player determines the game's themes and levels.
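
A minimal sketch of the energy bookkeeping that could drive such a game follows: compare the watts the rider generates against the watts the system draws, and map sustained output to a game level. The wattage thresholds are illustrative assumptions, not the team's calibrated values.

```python
# Illustrative energy balance and level selection for a pedal-powered game.

def net_power(generated_w, consumed_w):
    """Positive means the rider is producing a surplus the game can 'spend'."""
    return generated_w - consumed_w

def game_level(generated_w):
    """Map sustained rider output to a game level; thresholds are made-up examples."""
    if generated_w < 40:
        return 1   # easy warm-up scenes
    if generated_w < 80:
        return 2
    return 3       # hardest themes unlock only at high output

print(net_power(generated_w=75.0, consumed_w=55.0))   # 20 W surplus
print(game_level(75.0))                               # level 2
```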