Cornell University

Team Summaries - 2015

Intel-Cornell Cup Semifinalists

Atlas 7
Pennsylvania State University

This drone will be maneuvered using programmed computer boards, and will have recognition, tracking, and delivery capabilities. The drone is assigned a specific mark, such as an LED light or a symbol; when it sees this mark, it maneuvers into position above that point. Next, the drone will be programmed to air-drop materials. This air-drop capability is practical for many applications and can perform tasks other units cannot: with the advantage of being airborne, drones can operate with ease where land units would struggle to traverse uneven or muddy landscapes.
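The "maneuver above the mark" step can be sketched as a simple proportional controller: the marker's pixel offset from the camera's image center becomes a horizontal velocity command. The function name, gain, and sign convention below are illustrative assumptions, not the team's actual code.

```python
# Illustrative sketch: convert a detected marker's pixel offset from the
# image center of a downward-facing camera into a proportional velocity
# command. Gain and names are assumptions for illustration only.

def center_over_mark(marker_px, frame_size, gain=0.005):
    """Return an (vx, vy) velocity command (arbitrary units) that moves
    the drone toward the detected marker; moving toward the marker
    shifts it back toward the center of the frame."""
    cx, cy = frame_size[0] / 2.0, frame_size[1] / 2.0
    ex, ey = marker_px[0] - cx, marker_px[1] - cy  # pixel error
    return (gain * ex, gain * ey)

if __name__ == "__main__":
    # Marker seen 100 px right of and 40 px below center in a 640x480 frame.
    print(center_over_mark((420, 280), (640, 480)))
```

In a real system the gain would be tuned against altitude (pixel error corresponds to a larger ground distance the higher the drone flies), and the command would feed the flight controller's velocity setpoint.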

The drone will be equipped with a variety of sensors and a camera to collect important data and survey a location from an aerial view. On your monitor, you will be able to see everything the drone views, plus the current temperature, humidity, airflow, and more. The overall objective of technology is convenience and efficiency, and a drone can revolutionize the future of technology.


BreadBot
University of Pennsylvania

We seek to design and build an affordable, scalable, modular robotic educational platform to complement an online STEM curriculum. As a “STEM curriculum on legs,” BreadBot bridges the gap between theory and physical application by providing an opportunity for students to learn and experience math and physics concepts in a novel and interactive way.

The BreadBot platform will be a central robot hub, with easily attachable modules to introduce new behaviors, interactions, and concepts relating to a STEM curriculum. The structure of the product line will allow different online STEM courses to be represented as purchasable modules (BreadSticks), allowing students to buy more course packs as necessary.

At its core, BreadBot will be a bipedal legged robot with a caster tail. The robot will have a standardized locking mechanism to allow for easily swappable leg/wheel/caster modules, and the platform will include several sets of legs in various styles and stiffnesses. BreadBot will come equipped with an attachable integrated breadboard with set connections to specific pins on the microcontroller for easy integration of external circuits. We will also develop a wireless tablet-based app through which students can monitor and control the robot.


C.A.R.R. System
Boston University

The number of bicycle accidents on the road increases every year. While bicyclists are expected to take extra precautions, motorists can also help prevent bicycle accidents with the assistance of a bike detection system that helps improve a driver’s road awareness by alerting them to potential crashes before they happen.

The C.A.R.R. System (Cyclist Alert Real-time Response) is an all-in-one alert system that checks both sides of and behind your vehicle for approaching cyclists. Using a sophisticated real-time detection algorithm, the C.A.R.R. System instantly notifies you of potential and imminent collisions to assist your maneuvering before a crash occurs.

The system consists of an easy-to-install, mountable alerting system and features an intuitive driver alert interface. It uses two cameras, one on each side-view mirror, which feed an image-processing algorithm. If a cyclist is positively identified, the alert hub will generate non-distracting audio and visual feedback on the central dashboard, informing the driver of the danger and of the side from which the cyclist is approaching.
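One way to keep such alerts non-distracting is to debounce the per-camera detector output, so that a single-frame false positive never reaches the driver. A minimal sketch of that alert-hub logic; the frame threshold and side labels are assumptions, not the team's design:

```python
class AlertHub:
    """Debounced per-side alerting: raise an alert only after the
    detector reports a cyclist for `threshold` consecutive frames,
    suppressing one-frame false positives from the vision pipeline."""

    def __init__(self, threshold=3):
        self.threshold = threshold
        self.streak = {"left": 0, "right": 0}  # consecutive detections

    def update(self, side, detected):
        """Feed one frame's detection result for a side; return True
        if the audio/visual alert should fire for that side."""
        self.streak[side] = self.streak[side] + 1 if detected else 0
        return self.streak[side] >= self.threshold
```

Each camera's detector would call `update()` once per frame; the alert fires only after a sustained detection, and any missed frame resets the streak.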


CERBERUS
Seattle Pacific University

CERBERUS is an autonomous fire-fighting robot to be stationed in buildings, capable of first-response fire suppression without any human intervention.


ForceField
University of Pennsylvania

Our project, called “ForceField,” aims to replace the computer mouse with a 2.5-dimensional force-feedback haptic device. Engaging the sense of touch in computing will allow us to add new layers of interaction and immersion to everyday computing, as well as novel applications. A twenty-inch-diagonal planar workspace in the x and y axes will map the sliding input point to the computer screen’s cursor like a mouse, and an inch of motion in the z axis will provide a third Cartesian position input to the computer. By adding position sensors and actuators to each axis, we can add an arbitrary force-feedback vector to traditional mouse input. A few of our target applications include: viscous responses to clicking-and-dragging, using force vectors to model three-dimensional menus and buttons, the ability to guide a user’s cursor along a set path to assist muscle learning, and interactive games with force feedback. We will use the Intel Galileo Gen 2 for sensor reading, actuator control, and input/output processing. Our goal is to integrate with Microsoft Windows and custom applications.


GrowBox
Boston University

Our group is determined to make a user-friendly, automated hydroponic growing system for people to grow food and have fun doing so. The user will not be required to have any prior knowledge of growing plants, or any understanding of what is happening behind the scenes. GrowBox will be where the plant resides and what sees to its needs. Behind the scenes, a system of sensors and image-processing software will provide data on the plant’s status. An iOS app will act as an interface for this smart appliance, notifying the user when something is required of them, and what is required. GrowBox is intended to bring fresh, home-grown vegetables to those who don’t have the time, space, or experience to grow their own.


GWounD
University of California, Riverside

We will be designing and developing a smoke and CO detector that is wirelessly connected to a vibrating pillow over Wi-Fi. We will also develop an Android app that receives emergency alerts from the Internet. Any time smoke is detected or an emergency alert is sent out, the app will be notified and will in turn notify the vibrating pillow. This smoke and CO detector is intended to help the hearing impaired receive notification of fires and emergencies.
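The alert chain described (detector or Internet alert, then app, then pillow) is essentially a fan-out: any trigger notifies every registered output device. A small sketch under assumed names, purely illustrative:

```python
# Sketch of the alert fan-out: any trigger source (smoke, CO, or an
# Internet emergency alert) notifies every registered output, e.g. the
# vibrating pillow and the Android app. All names are illustrative.

class AlertRelay:
    def __init__(self):
        self.outputs = []  # callables: pillow vibrator, app push, ...

    def register(self, output):
        """Add an output device callback to notify on every alert."""
        self.outputs.append(output)

    def trigger(self, source):
        """Fan one alert out to all outputs; return their responses."""
        return [out("ALERT from " + source) for out in self.outputs]

if __name__ == "__main__":
    relay = AlertRelay()
    relay.register(lambda msg: "pillow vibrates: " + msg)
    relay.register(lambda msg: "app notification: " + msg)
    print(relay.trigger("smoke detector"))
```

The same `trigger` path serves both local detections and alerts arriving from the Internet, so the pillow reacts identically to either.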


HighSkeye
University of Pennsylvania

Team HighSkeye proposes to design a high-altitude platform (>50,000 feet) capable of maintaining the same height for extended periods of time.

Our primary objective is altitude control of a high-altitude weather balloon. By creating a feedback loop that is preprogrammed to open and close a pressure-controlling valve, we hope to maintain the balloon at a specific altitude until either 1) other factors cause it to fall (e.g., UV degradation of the latex) or 2) we specify a time frame or location at which the balloon should descend.
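In its simplest form, such a feedback loop is deadband (bang-bang) control of the vent valve: open it to arrest a climb past the band, otherwise keep it closed. The target and band values below are placeholder assumptions, not the team's flight parameters:

```python
def valve_command(altitude_ft, target_ft=50_000, band_ft=500):
    """Deadband (bang-bang) control of a helium vent valve: open the
    valve only when the balloon has risen past target + band, to
    arrest the climb; otherwise keep it closed. A latex balloon cannot
    be commanded downward without venting, so below the band the valve
    simply stays shut and buoyancy carries the balloon back up."""
    if altitude_ft > target_ft + band_ft:
        return "OPEN"   # vent gas to reduce lift
    return "CLOSED"     # hold gas; balloon rises or holds altitude

if __name__ == "__main__":
    for alt in (48_000, 50_200, 51_000):
        print(alt, valve_command(alt))
```

The deadband prevents the valve from chattering around the setpoint; a real controller would also track vented volume, since gas released to stop a climb is gone for good.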

Our platform could be used to conduct research experiments more efficiently. Our altitude control of the balloon will help users reach the optimal height for their mission, whether it be photography, high altitude research, or communications. Our platform could serve as a low-cost alternative to the burgeoning cubesat/nanosat market — it would enable the creation of a constellation of small satellites in the upper reaches of the atmosphere rather than in Low Earth Orbit, potentially at two to three orders of magnitude lower cost.


NetLane
University of Massachusetts, Lowell

Traffic congestion is a condition in which slow-moving vehicles congregate, causing increased travel time and fuel consumption. The average American wastes 40 hours and $818 a year due to traffic congestion. Collectively, the American population spends around $121 billion annually on fuel consumed in high-density traffic. By observing historical trends in traffic data, it is possible to mitigate the adverse effects of congestion by improving the infrastructure for highways, subways, and other forms of public transportation.

Current systems used for data acquisition can be intrusive, and some are prone to error depending on the weather. One recent solution is the use of Bluetooth sensors: with the increasing popularity of mobile devices, a chain of Bluetooth sensors can be used to analyze traffic based on the location of a passenger’s Bluetooth-enabled device. However, the number of people who constantly have Bluetooth enabled on their device is low.

NetLane aims to increase the sample size of data acquisition by using Wi-Fi sensors. Initially, NetLane will target buses to determine the volume of passengers at each stop as well as their origin and destination. Our solution will provide a scalable method of non-intrusive data collection for transportation engineers.


Project PAM
Southern Illinois University

Project PAM (Photoresin Additive Manufacturing) is the world's first open-source-hardware DLP 3D printer that follows the Open Source Hardware Definition set by the Open Source Hardware Association. This type of 3D printing gets away from the extruding spaghetti machines and instead uses light-curing resins to build your models. This means higher resolution, fewer moving parts, and faster build times.

Photoresin additive manufacturing printers have many advantages over fused deposition modeling (FDM) printers. There are DLP 3D printers on the market today; however, most have either a high cost or a small build volume. Project PAM takes DLP 3D printing in a new direction, providing the largest build area of any hobbyist DLP 3D printer on the market without sacrificing resolution, all in a low-cost, open-source design.

The goal of this project is to produce a high-resolution DLP printer that is fully open source, built from readily available or easy-to-fabricate hardware in a flexible, well-documented design. Project PAM has a maximum build volume of just under 9 liters. We accomplish this by supporting dual 1080p projectors; however, the configuration can be adapted to meet a user's needs.


RISE
University of Massachusetts, Lowell

A stroke is a medical emergency that can cause permanent neurological damage or death; in the United States and worldwide, it is one of the leading causes of death. Treatment to recover lost function is termed stroke rehabilitation, which ideally takes place in a stroke unit and involves health professionals in speech and language therapy, physical therapy, and occupational therapy. If the patient receives these therapies in a timely manner, the recurrence of stroke can be reduced. In our project, we aim to develop a twofold computer-aided monitoring and rehabilitation system for stroke patients using a low-cost wearable computing platform and devices. The objective of the first part of the system is a wearable device that detects abnormal events (e.g., falls) for stroke patients in real time; this device would also act as a daily activity monitor for the patient. The objective of the second part is a human-body-tracking system that checks whether patients are carrying out rehabilitation activities in a correct and timely manner. Finally, all the data generated by this system would be loaded into a network-based patient management system to provide feedback to physicians and generate reports.


Royal Engineers
University of Scranton

The invention is a motion controller capable of operating a movable device in multiple dimensions/axes to track an object. Example devices controlled in the proof-of-concept design include an astronomical telescope and a reflective solar tracker array. For the reflective solar tracker, the motion controller uses astronomical sun-position data from the U.S. Naval Observatory to keep solar panels continuously facing the sun, maximizing their energy output. Future embodiments of the motion controller include user-input GPS coordinates to point a device at a desired location.


SCRAM (Supply and Command Rover for Autonomous Multicopter)
University of Central Florida

This project covers the design and creation of an autonomous rover whose main purpose is to communicate with, and charge, a semi-autonomous multicopter. The rover will be able to wirelessly issue waypoint commands as well as designate preset paths for the copter to follow. These preset paths will be uploaded to the rover from an outside station using long-distance communications and then relayed to the multicopter using short-distance communications. When the copter's battery life drops to a certain percentage, the rover will be notified via radio transmission, and a rendezvous location will be set by the rover based on the relative locations of the two vehicles. When the copter and rover come within close proximity of one another, a targeted autonomous landing sequence will commence. Once the copter has safely landed on the rover's charging pad, the solar energy the rover has gathered will charge the copter's battery. While charging, the copter will wirelessly transfer the sensor data accumulated during flight to the rover, freeing up storage space for its next flight. Once charging and data transfer are complete, copter and rover will separate and the mission will resume.
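The battery-triggered rendezvous step could be sketched as follows. The battery threshold and the naive coordinate midpoint (adequate only over short separations, since it ignores great-circle geometry) are illustrative assumptions:

```python
def needs_charge(battery_pct, threshold=25.0):
    """True once the copter's reported battery falls to the threshold
    (threshold value is a placeholder, not the team's figure)."""
    return battery_pct <= threshold

def rendezvous_point(rover_pos, copter_pos):
    """Naive midpoint of two (lat, lon) pairs as a rendezvous location.
    Fine for the short ranges involved; a real system would use proper
    geodesic math and account for terrain the rover can reach."""
    return ((rover_pos[0] + copter_pos[0]) / 2,
            (rover_pos[1] + copter_pos[1]) / 2)

if __name__ == "__main__":
    if needs_charge(22.0):
        print(rendezvous_point((28.60, -81.20), (28.62, -81.24)))
```

In the described system, the copter radios its battery level and position; the rover would run checks like these, pick the meeting point, and both vehicles would navigate there before the landing sequence begins.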


Slate8
Howard University

Ever wonder what it is like to communicate without speaking? You can try using hand gestures or home signs, but the problem is that they are not universal. You would need a sign language that everyone understands, but unfortunately none exists. Even a widely accepted language like American Sign Language (ASL) is barely understood by anyone outside the hearing-impaired community. This is a problem hearing-impaired people face in their everyday lives.

Our project, Slate8, will help hearing-impaired people communicate conveniently with people unfamiliar with American Sign Language (ASL), and hence bridge the communication gap. It will also raise users' self-confidence as they adapt more easily to the broader community. The system will employ an Intel Atom board with a camera to stream video as input, which is then converted into text/voice using real-time image-processing algorithms.


Soft Robotic Hand
Worcester Polytechnic Institute

Conventional hard manipulators require a high degree of precision when attempting to grasp an object, and typically are limited to a small range of tasks. Robots capable of high-precision manipulation are prohibitively expensive for many applications. Lower-cost robots, such as Rethink Robotics’ Baxter, lack the precision to interact with small or delicate objects.

Our solution to this problem is to design and develop a soft robotic manipulator that can lessen the precision required with robotic grippers today. The design will incorporate fluidic elastic actuators constructed out of silicone rubber, and will have soft, deformable sensors embedded within the actuators for position and force feedback. The feedback systems will allow the force output of each actuator to be controlled individually.

By focusing on safe, reliable grasps with limited precision, our gripper is able to manipulate objects that would otherwise require a much more expensive system. The gripper is projected to be low-cost compared to other hard manipulators. Existing soft robotic grippers, such as the RBO hand and PneuNet technology, lack the controls necessary to safely interact with objects requiring precise movements.


Team BionUX
University of Pennsylvania

Our team has identified the next area of improvement for prostheses, one that can increase functionality at a reasonable cost: adding temperature, pressure, and orientation feedback for the user.

Two million people live with limb loss in the United States, and hundreds of thousands of United States citizens are added to this statistic each year. Creating a prosthesis that functions intuitively for these amputees is imperative for two reasons. First, a prosthesis with added functionality will help the amputee perform day-to-day tasks. Second, an intuitive prosthesis matters because of the impact of phantom limb pain (PLP), which is characterized by aching or other uncomfortable sensations perceived at the site of the amputation. A prosthesis that correlates body signals to its movements may eliminate PLP and thus improve the user’s quality of life. Adding feedback will increase this correlation, making the prosthesis better able to benefit the lives of millions. As a result, our team will be developing a low-cost, 3D-printed, myoelectrically controlled, haptically instrumented prosthetic arm for trans-humeral amputees. The system will also include haptic presentation of the tactile data collected, and we plan to make it an open-source project.


Team DORA
University of Pennsylvania

DORA is an experience-driven, entertainment-based navigation platform that will allow users to conveniently, economically, and thoroughly explore remote locations. We seek to capture the enthrallment, inspiration, and sense of belonging associated with traveling to lands both familiar and foreign. As such, there will be two main thrusts: (1) the “end-user” experience, equipping the average consumer with the sensory stimulation needed to feel fully immersed in the environment they are exploring, and (2) the exploration platform itself, which will ideally be capable of traversing extreme arenas inaccessible to casual visitors, or that by their nature otherwise prevent visitation.


Team PIES
University of Houston

The pipeline fire isolation system for emergencies provides local and remote access to, and control over, emergency shutdown valves inside a pipeline to isolate a fire. The program will also provide display readouts so the user understands exactly what is happening inside the pipe (i.e., pressure, temperature, flow rate, etc.). There will also be a stand-alone hydraulic system that implements an emergency shutdown response, decreasing the time it takes for the valve to close.

Currently, in the event of an emergency, a system shuts two threaded valves to suffocate the fire inside the pipeline. This process is slow, taking about 3 seconds to completely shut the valve; it wastes a large amount of product and causes significant property and environmental damage.

In 2013, 62 significant pipeline fires occurred, causing over $15 million in damage to property alone, 8 deaths, and 40 serious injuries. In the past 20 years, no safety systems have been implemented that significantly reduce these numbers. We would like to see the number of accidents cut in half.


Team Stratus
University of Colorado Denver

The field of computer vision seeks to develop machines capable of performing tasks that currently only humans can perform. Important aspects of human vision, such as the ability to identify obstacles, plan paths around them, and learn from one’s surroundings in real time, are crucial to many societal uses of technology. Each year, computer systems running complex vision software step closer to emulating each stage of human vision. However, the accuracy and efficiency of such systems have always been obstacles to integrating greater amounts of automated robotics to help people. To enable next-generation robotics to perform field sensing in dangerous and remote environments, Stratus will deploy a sophisticated real-time Simultaneous Localization and Mapping (SLAM) algorithm to provide precise odometry, high-resolution 3D environmental maps, and awareness of specific adverse environments. The Stratus team will investigate the use of 3D sensing and computer vision technology available for mobile platforms and design custom algorithms that trade off execution time, accuracy, and power efficiency. Overall, the goal is to demonstrate a large-scale near-field mobile 3D mapping system to aid in rapid disaster recovery and environmental analysis scenarios. Specifically, Stratus will aid in surveying Colorado Rocky Mountain rock slide and avalanche areas.


Team WASP
University of Pittsburgh

WASP is a small unmanned aerial system (SUAS) able to track and follow multiple GPS transponders. Its purpose is to improve on current SUAS systems employed by the military and to present a new method for employment in the field. Current systems lack autonomy, have excessive training requirements, and require a dedicated team to operate. WASP will address these issues to make the system easier and more effective to use.


ToyBotz
University of Pennsylvania

Preterm birth and low birth weight are common factors that put infants at risk for developmental delays and impairments. Early detection of these impairments and subsequent intervention can lead to better health outcomes for these children. Successful detection of at-risk infants depends on the effectiveness of current clinical scales, which are not sufficiently sensitive to screen infants younger than six months. We propose to develop a SmarToyGym to detect at-risk infants from 3 to 11 months by non-invasively monitoring and quantitatively measuring infant body and hand movements in a natural play environment.

The SmarToyGym incorporates sensorized wireless toys placed strategically within reach of the infant to elicit toy-oriented body, arm, and hand movements, in order to assess reach and grasp forces during play. Additionally, a Microsoft Kinect 3D motion-capture system will collect video data of infant kinematics and posture as the infant reaches for the toys. Utilizing these metrics of infant motion, along with the squeezing, pinching, tilting, and shaking of the sensorized toys, will enable the SmarToyGym to accurately differentiate between atypical and typical infants from 3 to 11 months of age. The SmarToyGym can be used as a low-cost predictive and evaluative tool, providing ample opportunity for early intervention.


WALRUS Rover
Worcester Polytechnic Institute

Search and rescue robots have been in use since the September 11, 2001 attacks on the World Trade Center. They were deployed to search for victims, identify hazardous materials, and provide human rescue crews with a thorough understanding of the dangerous environment. Robots were also deployed after natural disasters including hurricanes Katrina, Ike, and Sandy, and the 2011 tsunami in Japan. These situations presented an additional challenge for rescue teams because of pervasive flooding, which could not be overcome by commercially available land rovers.

Our proposed solution is the Water and Land Remote Unmanned Search Rover (WALRUS Rover), an amphibious rover to aid in the search and discovery of survivors. It will feature shared autonomy, high-definition vision systems, payload capability, two-way communication, and advanced mobility. Such a rover would not only give relief teams more information about the situation at hand, but also eliminate the danger of sending search parties into harsh and high-risk environments. The WALRUS Rover will be capable of overcoming indoor and outdoor obstacles like stairs and rubble as well as navigating in flooded environments, enabling a wider range of mission profiles.


Intel Open Team Summaries

ADRA
Arizona State University

When the landscape is torn up by natural or man-made disasters, locating victims, deploying resources such as food, water, and medical supplies, and establishing communication become enormous challenges. Even if we have the required resources, all of the effort is in vain if we cannot deliver them to the victims. Therefore, identifying and locating victims at the time of a disaster is of primary importance for any relief worker. Speedy detection saves the relief worker precious time, which is critical for preventing and minimizing casualties.

We propose an Airborne Disaster Relief Assistant (ADRA), which will help locate a victim, provide the victim's GPS location to personnel for quick response, and deliver important packages such as first-aid kits, food rations, and communication aids. The ADRA will have optical recognition and heat-sensing capabilities for human detection, and the data collected can be transmitted to a base camp for analysis and decision-making. It will also possess an attitude adjustment system for maneuvering and collision avoidance.


APRo
University of Pennsylvania

APRo, an application-based robotic platform geared toward the hacker and hobbyist communities, enables users to develop robotic applications with its multi-purpose hardware. Just as the Arduino catalyzed the microcontroller market and smartphones pioneered mobile applications, our framework is designed to springboard robotic applications. We aim to achieve the following objectives:

  • Combine commonly used sensors and multi-media hardware into a single package
  • Provide an easy-to-use interface (API) for accessing the underlying hardware
  • Create examples to demonstrate integrated functionality (such as telepresence)
  • Release open source software and hardware under the GPL license
  • Develop an aesthetically pleasing, robust, and compact mechanical platform
  • Mold the device into an affordable, mass-manufacturable, and marketable product

Auto Safe
Oregon State University

Our project entails the creation of a wireless network that communicates between vehicles, and an interface within the vehicle, to create an overall safer driving experience. Our challenge is to make drivers more aware of their surroundings while also reducing distractions. This can be accomplished with a family of embedded devices that communicate with one another to notify drivers of the potentially hazardous situations that the devices and their algorithms detect. This will make roads safer by giving drivers real-time information updates from other vehicles in the network; these warnings and messages will be displayed on an interface inside the car.

Vehicle interfaces are becoming increasingly complex, but this complexity has moved the driving experience away from the road and has created more distractions for the driver. Our project is to enhance the driver experience by making a simple interface which is concentrated in one region to reduce distractions and to create a safer driving experience. We will do this by creating an interface which allows the driver to focus on the road while being able to control all critical features of the car.


Bright Communications
University of Houston

Internet consumption is growing exponentially with the rise of video streaming and the “Internet of Everything” (IoE). The growing demand for bandwidth from applications such as Ultra HD video is quickly outpacing improvements in wireless technology. Furthermore, high-density areas such as concerts and auditoriums still challenge wireless network engineers to provide high-speed yet reliable coverage. A combination of interference, duplexing, and security issues impacts today’s wireless networks.

According to Harald Haas's TED talk, Visible Light Communication (VLC) provides reliable data transfer that is ten times faster than today's 802.11n standard. Bright Communications (Li-Fi) aims to function alongside normal Wi-Fi, helping to load-balance users between the two technologies and allowing seamless access for all users. Bright Communications will allow reliable high-speed transfers using your everyday light source: LEDs. Unlike Wi-Fi, Bright Communications is a low-power yet efficient solution to the wireless problem, providing full-duplex communication. Additionally, this solution scales not only in the number of users but also to various upcoming technologies, including vehicle-to-vehicle communication and improved in-flight networking.


Elephant Tracking Device
University of Akron

Elephant endangerment is a continuing problem around the globe, with the World Wildlife Fund listing elephant populations as three vulnerable, five endangered, and one critically endangered. Current conservation methods include tracking systems installed as a collar worn by the elephant. Researchers have also made advances in understanding the infrasonic communication elephants use across long distances. The current collar design has a finite lifespan and requires sedation of the elephant to install and repair. Employing wireless power charging through magnetic resonant coupling increases the amount of power available on the collar while also extending the lifespan of the device, allowing higher-resolution tracking and the introduction of an infrasonic audio system. The goal of the proposed elephant monitoring device is to provide real-time access to infrasonic elephant communication and location while extending the lifetime of the collar. Acquiring infrasonic sound data from the vicinity of a wild elephant, in conjunction with the elephant's location, will allow researchers to study this unique communication method alongside elephant behavior in the natural environment. The increased field life of the device also means less maintenance and reduced human interference with the animal.


Extreme Sports UAV
Purdue University

Extreme sport activities are physically challenging and visually stunning. Because of their inherent danger and physical requirements, there is no cost-effective way for individual extreme sport enthusiasts to fully video-record their activities. Current ways of recording include follow-up helicopters, camera-equipped safety helmets, fixed-point cameras, smartphones, or, in some cases, paired-up camera operators. These recording methods all have limitations in cost, safety, or video quality, and these limitations affect every extreme sport enthusiast.

Our team proposes to design a UAV that will follow a rock-climber and video record his or her activities. The user is assumed to carry an Android device and wear recognizable clothing. A commercial UAV, including the controlling module, will be purchased to serve as the intelligent carrier of the video recorder. We will use an Atom processor to interact with a flight control module to achieve auto following and recording. An Android device will be utilized to send commands to the UAV and switch between manual and auto mode.


Funktioneers
Howard University

In this project we plan to build an electronic voice lock that uses the GSM cellphone network. You will be able to use your phone to call your lock and, using voice encryption as a layer of security, unlock your door from anywhere on the GSM network.


GesturTech: Enhancing Professional Presentations using Technology
University of Houston

The fluid progression of a professional oral presentation is often interrupted by the hand-bound controllers that must be operated. Presenters are constantly required to manipulate a computer to progress through a presentation, typically via a hand-bound input device such as a mouse or keyboard. Recently, presenting has become more convenient thanks to remote clickers and pointers; however, this method still binds the presenter's hand to the input device, adding another form of inconvenience. Experts at SOAP Presentations of São Paulo, Brazil have conducted research on personal presentation mistakes, showing that among the most common mistakes individuals make are poor body language and limited movement.

To deliver this functionality, a combination of speed and positioning sensors will allow the user to communicate his or her gestures to any remote-controlled device. The technology will be wearable and self-powered. With this wearable user interface and wireless communication, the user will be able to use their arm virtually as an input device for presentation navigation. This device aims to eliminate the disruption caused by having to “pause and click” through a presentation. Future applications of this device range from PC control to drone navigation.


Haptech
University of Rochester

Haptech - University of RochesterHaptech is a multisystem computational and hardware platform for environmental virtual reality, capable of tracking a user’s movements, translating them into a virtually constructed space, and providing full-body haptic feedback when the user comes into contact with a virtual object. In layman’s terms, this allows a user to “feel” a virtual object simulated in front of them via a computer program, in conjunction with visual feedback from a VR headset such as the Oculus Rift. The project is currently capable of performing this task using vibrational feedback on the hands within a confined space, using infrared camera tracking to interpret the user’s location. Our future goals for the project include a hybridized tracking system that fuses 9-axis sensors with camera motion capture, multi-camera tracking for large and flexible environments, and more sophisticated methods of force feedback for user interactions.


Head Turner
University of Pennsylvania

Team HeadTurner - University of PennsylvaniaOur project aims to raise the safety standard of bike helmets. To accomplish this, we propose a two-step solution. First, we will optimize visibility by incorporating turn signals, brake lights, and a headlight into each helmet. This will help not only cyclists but also the drivers around them. We will use gyroscopes, accelerometers, and a magnetometer, incorporated into a modified processor. Additionally, we plan to redesign the helmet to increase its damping coefficient, with the goal of reducing concussions in light-to-medium impact situations, while maintaining the airflow and lightweight qualities typically associated with bike helmets.
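As a rough illustration of how IMU readings could drive the helmet's lights, the sketch below maps a deceleration threshold to the brake light and a yaw-rate threshold to the turn signals. The function name, thresholds, and sign conventions are illustrative assumptions, not the team's actual implementation.

```python
# Hypothetical sketch of the helmet's signal logic; thresholds and the
# update function are assumptions for illustration only.

BRAKE_DECEL_THRESHOLD = 2.0   # m/s^2 of deceleration that triggers the brake light
TURN_YAW_THRESHOLD = 0.5      # rad/s of yaw rate that triggers a turn signal

def update_lights(forward_accel, yaw_rate):
    """Map raw IMU readings to light states.

    forward_accel: acceleration along the direction of travel (m/s^2),
                   negative when slowing down.
    yaw_rate: rotation rate about the vertical axis (rad/s),
              positive for a left turn in this convention.
    """
    brake_light = forward_accel < -BRAKE_DECEL_THRESHOLD
    left_signal = yaw_rate > TURN_YAW_THRESHOLD
    right_signal = yaw_rate < -TURN_YAW_THRESHOLD
    return {"brake": brake_light, "left": left_signal, "right": right_signal}
```

In practice the raw accelerometer signal would need filtering (and gravity compensation from the gyroscope/magnetometer fusion) before a simple threshold like this is reliable.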


HSMRS
Worcester Polytechnic Institute

HSMRS - WPIOur team will be developing and implementing a framework for Human Supervision of Multi-Robot Systems (HSMRS). In many applications it is beneficial to utilize a team of different robots to accomplish coordinated tasks. Some applications, however, involve tasks that may be too complicated or high-risk for a fully autonomous system, necessitating a human supervisor. This project will enable teams of heterogeneous robots to be used for various applications while taking advantage of human supervisors to aid in tasks which the robots cannot or should not perform on their own. With the wide range of potential applications for such a framework, this project has many stakeholders and, by extension, many requirements. We will be using a Systems Engineering approach to ensure that our project meets the requirements of our stakeholders, performs well according to our key performance parameters, and is well tested against relevant scenarios found in industry and research.


LoWiFi
University of California, Davis

LoWiFi - UC DavisWhat we’re offering is a cheap, simple radar detection system that can detect and track a tagged item and relay its location to you, eliminating the time and effort wasted on searching. Naturally, the application could also be used to keep track of your children around a store in case you get separated, or even to track the location of your parked car in a parking lot.

We would build affordable, easy-to-use interfaces for a transmitter to communicate with an antenna attached to the missing item. Using the information the main detector receives from the attachment, the detector will determine the item’s location and distance, and whether it was moving at the time of the query (in case you’re tracking a child).

The entire system will consist of a stationary unit that constantly sends high-frequency electromagnetic waves, searching for the tracker. The tracker will send waves in response to the search signal, and once it is detected, the system will be able to recognize whether the tracker has moved and at what speed it is moving.
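One plausible way such a system could estimate the tracker's speed is from the Doppler shift of the tracker's reply signal. The sketch below assumes a one-way shift (the tracker actively transmits rather than passively reflecting); the carrier frequency and measured shift are made-up example values, since the team's actual signal processing is not described.

```python
# Illustrative Doppler speed estimate; all numbers are example values.

C = 3.0e8  # speed of light, m/s

def doppler_speed(carrier_hz, shift_hz):
    """One-way Doppler: a source moving toward the receiver at speed v
    shifts its carrier by approximately (v / c) * carrier, for v << c."""
    return shift_hz * C / carrier_hz

# Example: a 2.4 GHz beacon observed 16 Hz high implies a ~2 m/s closing speed.
speed = doppler_speed(2.4e9, 16.0)
```

Note that for a passive reflector (classic radar), the shift is doubled because the wave travels out and back, so the factor would be v = c * Δf / (2f) instead.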


Mechanek
University of Pennsylvania

Mechanek - University of PennsylvaniaIn the automotive racing industry, the most commonly used head restraint system is the HANS device. This device was developed in the early 1980s with the goal of decreasing the risk of basilar skull fracture during automotive collisions. It uses two rigid straps to constrain head motion during a crash. Drivers using this restraint system commonly complain of feeling restricted by the device, and while the risk of basilar skull fracture is much lower, concussion and other mild traumatic brain injury are still common.

Our team will design, build, and validate a head and neck support system for automotive racing that uses active dampers for energy dissipation. The product will improve on current options by decreasing the rates of traumatic brain injury during collision, by increasing driver visibility and mobility, and by lowering the chance of peripheral injury such as collarbone breakage. An active damper system will leave low-speed movement unaffected while providing precise, tunable control over head deceleration during a crash event.


Ora: Energy Solutions
Oregon State University

Ora Energy Solutions - Oregon StateIn a world where our resource and energy consumption is rapidly outpacing our energy production, many conscientious consumers are turning to technology for smarter ways to reduce their own environmental impact. Despite growing research into greener power, alternative fuels, transportation, and smarter grids, these solutions are often out of reach for the less technically inclined consumer, leaving them in a quandary as to how to actually put their good intentions and others' technological advancements to use in their everyday domestic lives.

Partial and attempted solutions exist, but as of yet no truly simple, easy-to-understand product enables someone to effectively monitor and control the power consumption in their home. Our solution, Ora, would provide the average household user with convenient, easy-to-access information about their daily power consumption right on their smartphone, with easy-to-install modular units giving them control over household appliances, electronics, and lights in the palm of their hand. By providing the average homeowner with real-time, day-to-day power information, and control over how they use it, our solution will help reduce their power bill and their overall energy consumption for a greener tomorrow.


PARbot2
Worcester Polytechnic Institute

PARbot2 - WPIThe purpose of this project is to develop a Personal Assistive Robot that can aid in the care of the elderly and those suffering from age-related disabilities. This year, a group of five students is working on evolving the previous year’s Major Qualifying Project (PARbot1). With the help of the Psychology Department, the team will determine the current needs of those with age-related disabilities that this year’s Personal Assistive Robot (PARbot2) can fulfill, as well as what makes for easy user-robot interaction. The aim for PARbot2 is to create a robot with assistive and companion abilities. One of the main objectives is to add a robotic arm to the Personal Assistive Companion Robot to aid with mobility issues and to make it user friendly.


ReadSense
Georgia Institute of Technology

ReadSense - Georgia TechThe Problem: finding a way to make reading simpler for visually impaired and disabled people.


Smart Wheelchair
Worcester Polytechnic Institute

Smart Wheelchair - WPIThis project improves the intuitive interfaces and control design for an existing Smart Wheelchair targeted for persons with Locked-In Syndrome (LIS). The semi-autonomous wheelchair developed by the Robotics and Intelligent Vehicles Research Laboratory (RIVeR Lab) at Worcester Polytechnic Institute performs assistive navigation control through obstacle avoidance and wall following. The project work includes improving existing systems and adding new capabilities. The assistive navigation control will be improved by enhancing the system’s responsiveness, refining the Emotiv headset navigation interface, and adding an innovative EMG sensor for navigation using muscular signals. Additionally, to enable users to engage their environment through tasks such as self-feeding, a JACO robotic arm and Kinect sensor will be integrated. The overall goal of this project is to increase the user’s self-sufficiency.


Team Arete
Johns Hopkins University

Team Arete - Johns HopkinsTeam Arete proposes to build an EEG-based fear-recognition system. EEG has served as a non-invasive surveying tool as well as a powerful diagnostic tool in the medical field, with which investigators can check the activity and functionality of neurons in the brain. However, owing to its low spatial resolution and accessibility, few studies have attempted to identify human emotion using this method. In this project, we intend to develop a system that can recognize human emotion (specifically fear) based on a series of event-related potential experiments using a 16-channel EEG setup. We will overlay known neuroanatomical data and target specific neurons in suspected brain regions (prefrontal cortex, amygdala, C1 response of primary visual cortex) to collect EEG signals. Signals collected from a wearable EEG detector during the presentation of fear-evoking stimuli will be isolated, processed, and wirelessly transmitted to the experimenter’s electronic devices for further analysis. One potential application of this project is improved computer-user interaction, allowing computers to provide varying responses based on the user's fear level. Further applications lie in the field of medicine, to better understand neurological disorders strongly associated with the fear response, such as PTSD.


Team Carter School
Boston University

Team Carter School - Boston UniversityThe Carter School promotes education and individual growth for students aged 12 to 22 with severe physical and mental disabilities. Our goal as Team Carter is to create a personalized greeting for each student as they enter a specific classroom. We hope this system will help the students make associations between cause and effect, further developing their cognitive skills and increasing their contentment at school.


Team H2O
University of Houston

Team H2O - Houston Sustainability projects have become commonplace in public facilities across the U.S.; however, many of them lack proper service-monitoring systems. Manually monitoring these systems is expensive and time-consuming, and it often results in reactive responses that can cause system downtime. Currently, the University of Houston employs four full-time employees working 40 hours per week on the maintenance of EZH2O bottle-filling stations. Wireless sustainability efforts can reduce the labor cost of maintenance by 50%, creating a weekly savings of $4,000. By implementing self-monitoring systems, the University can use its resources more effectively.

Our wireless sustainability information and monitoring system will track each EZH2O unit's filter status and number of bottles saved, and report this data to a web server. The initial scope of this project is the University of Houston, which currently has 68 bottle-filling stations spread across its 667-acre campus. The number of stations and the distance between them drastically increase the cost of maintaining and servicing these machines. The wireless sustainability information and monitoring system will generate significant savings in labor and time required for maintenance.


Team MuMu - Mobile Device Locking System
University of Houston

Team MuMu - HoustonAccording to an online source, SafeguardTheWorld, 2 million home burglaries are reported in the US every year. Our product aims to protect your home from burglary using your mobile device. A Pew Research study found that “58% of American adults have a smartphone” and “42% of American adults own a tablet computer.” The purpose of this project is to develop a more convenient and safer method of accessing and protecting your home using a smart device. Such a system also allows additional features, such as letting the user unlock their doors from anywhere in the world and automatically informing them of any activity at the door.

The application requires a PIN code, preventing others from accessing it with a stolen smartphone. The system would be convenient for points of access shared by numerous people, such as entry into gated communities. Additional functions could give emergency personnel special access.


Team SSLN/Breaze
University of Pennsylvania

SSLN Breaze - University of PennsylvaniaBreaze is a portable, autonomous solution to oxygen tank transport for the purpose of supplemental oxygen therapy. A variety of ailments, such as chronic obstructive pulmonary disease, late-stage heart failure, cystic fibrosis, and pneumonia, require a continuous oxygen supply. As elderly patients travel within the hospital, they often require nursing assistance to carry their oxygen tanks, which discourages them from maintaining their physical therapy.

Breaze is a robotic retrofit for oxygen tanks that can follow patients around in a hospital setting, eliminating the need for manual transport or nursing assistance. The device is powered by DC motors and can change direction with the patient, as well as avoid obstacles in its path. By implementing robotic control and path-planning algorithms, Breaze will track patients and maintain a proper distance, making mobility more feasible and less of a burden. To meet the demands of an aging population, Breaze increases patients' quality of life by promoting mobility and incentivizing consistent oxygen therapy.
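The "maintain a proper distance" behavior could be sketched as a simple proportional controller on the measured gap to the patient. The target gap, gain, and speed cap below are illustrative assumptions, not values from the Breaze design.

```python
# Minimal distance-keeping controller sketch; all constants are
# illustrative, not from the actual Breaze implementation.

TARGET_DISTANCE = 1.5   # meters to maintain behind the patient
KP = 0.8                # proportional gain (1/s)
MAX_SPEED = 1.2         # m/s cap for safety in a hospital corridor

def follow_speed(measured_distance):
    """Command a forward speed proportional to the distance error:
    speed up when falling behind the target gap, stop when at or
    inside it (never drive into the patient)."""
    error = measured_distance - TARGET_DISTANCE
    speed = KP * error
    return max(0.0, min(MAX_SPEED, speed))
```

A real system would layer obstacle avoidance and smoother velocity profiles on top of this, but the proportional term captures the basic "keep pace at a fixed gap" idea.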


Team Vitastats
Boston University

Vitastats - Boston UniversityWe are creating a remote health monitor as a proof of concept. It will use microphones to record heart and lung sounds. Real-time analyses will be carried out on an on-board processor to identify benign and pathological heart sounds. A diagnosis will be made and transmitted over Bluetooth to a smartphone for viewing waveforms and placing a call for help if needed.


Team WCNN
University of Massachusetts, Amherst

Team WCCN - UMass AmherstOur project is a wireless network of battery-powered wildlife cameras. In our implementation, each node would capture images of wildlife, gather environmental data such as temperature and barometric pressure from sensors, and propagate data from other nodes to the server. The server would host the images and environmental data on a website. The system would be designed for low power so that the batteries would not need to be changed frequently.

The advantages of this system over existing implementations are that it does not require a person to collect memory cards from the cameras, which can interfere with wildlife, and it does not rely on cell phone service, which is unavailable in many areas where wildlife is studied.


Team Wheelchair Visionaries
University of Pennsylvania

Wheelchair Visionaries - University of PennsylvaniaWe plan to design and build an adapter kit that converts a manual wheelchair into a motorized wheelchair. The kit will use eye-tracking technology, which we will also develop, to control the wheelchair. It will allow the same range of motion as a standard motorized wheelchair, and it will be portable and easy to install and remove.


The Achievers
Arizona State University

The Achievers - Arizona State UniversityThe intended purpose of our project is to build an interactive system that can play games with a child. Specifically, we are focusing on developing a system that allows a user to play soccer or fetch with a robot. We will communicate with the robot using voice commands; to initialize an activity, we will use commands such as “Cosmo, let’s play soccer” or “Cosmo, let’s play fetch.” Our goal is for the robot to find and identify a stationary ball or intercept a moving ball, then stop and kick it or carry it back to the user, depending on the game being played. The robot is based on a pre-existing open-source API platform. Some of the tools that will be used in its creation are as follows: a Microsoft Kinect, OpenNI (for voice commands and vision algorithms), a variety of sensors (electrode and ultrasonic), an Intel Atom (for processing high-level vision and voice commands), a BeagleBone (for embedded system controls), and actuators (for closing arms and kicking the ball).


Tufts Robotics
Tufts University

Tufts Robotics - Tufts UniversityThe Tufts Robotics Club plans to use an Intel Galileo in the construction of a universal gripper arm. The universal gripper, a novel gripper technology developed by Cornell University, uses the jamming of fine material to pick up objects: a balloon filled with fine material is pressed onto an object, and air is then sucked out of the balloon, causing the material inside to constrict around the object so it can be manipulated. Here is a video: http://www.youtube.com/watch?v=0d4f8fEysf8

The arm portion consists of two joints and a base able to rotate 360 degrees in order to achieve a full range of motion. A challenge in building the arm is accounting for its weight. Timing belts attached to servos at the base will make the arm accurate and quick-moving, while also offloading the weight of the motors from the arm, adding lifting capacity. The arm segments will be constructed from PVC to make the arm even more lightweight.

The arm has applications in mobile robots tasked with manipulating their environment. Additionally, this type of arm mechanism is extremely flexible in that it is able to pick up virtually any kind of object.


UH2O Smart Water Fountain
University of Houston

UH2O Smart Water Fountain - HoustonSince the cost of water in Houston has increased by more than 60% since 2000, it is in our best interest to implement economical measures, such as reducing water waste. Traditional drinking fountains produce a considerable amount of wastewater, and bottle-filling stations still rely on user input to dispense the correct amount. In addition, because the filters may need replacement at any time, maintenance crews must check the filter status of each fountain multiple times a week, adding to the overall cost.

The improvements we plan to make to the refill station include a WiFi shield for real-time wireless communication, allowing remote monitoring by sending information such as filter status and bottle count to headquarters. This will eliminate the need to manually monitor each station, thereby reducing labor costs. Another improvement will be to install sensors that measure the volume of a container, so the station can dispense the appropriate amount of water without overflow or constant monitoring. With these improvements, the 63 refill stations could reduce greenhouse gas emissions by at least 511 tons in one year.
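To make the remote-monitoring idea concrete, the station's report might look like a small JSON payload sent to the monitoring server. The field names, the service threshold, and the station ID below are assumptions for illustration only; the summary does not specify the actual message format.

```python
# Hypothetical shape of a station's status report; field names and the
# 10% service threshold are illustrative assumptions.
import json

def build_status_report(station_id, filter_pct_remaining, bottle_count):
    """Bundle a station's state into a JSON payload for the web server."""
    return json.dumps({
        "station_id": station_id,
        "filter_pct_remaining": filter_pct_remaining,
        "needs_filter_service": filter_pct_remaining < 10,
        "bottle_count": bottle_count,
    })

# Example: a station whose filter is nearly spent flags itself for service.
report = build_status_report("UH-LIB-01", 8, 15234)
```

With each of the 68 stations pushing a payload like this over WiFi, a maintenance crew only visits stations whose reports flag them, rather than checking every fountain several times a week.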