Touch Me Display

Project Overview

 

The purpose of the Touch Me project was to create a touch screen that could tell visitors about the Skunk Works® Innovation Gymnasium using two existing glass panels. Students on the project were Jeff Artigues (freshman, CSE), Andrew Blanchard (freshman, EE), Ian Cowley (freshman, CSE), Aaron Evans (sophomore, CSE), and Elise McDonald (junior, EMIS). Project advisors were Dr. Nathan Huntoon and Nick Vrana.

 

The project broke down naturally into three main work streams: hardware, application, and software. The hardware effort sought to create the physical touch screen. The application is the component projected onto the screen for users to interact with. The software interprets and transmits touches on the screen so that the application can respond appropriately.

Resources and Constraints

 

The team’s primary resources were student and project advisor talent and experience, computers, Innovation Gym tools and supplies, the Mechanical Engineering machine shop, and a $2,300 budget. Together, these resources gave the project a very reasonable probability of success.

 

The biggest project constraint was time; however, the team worked quickly and sought to achieve milestones daily. When there was a delay, it was usually because we were waiting for a product to arrive. The projector’s projection area created some overscan, but this opened the opportunity to build two touch screen windows rather than just one. The size of the lower window dictated the size of the app’s display area, and the height of the average user dictated which portions of the screen could be interactive and thus where buttons could be placed in the app.

How It Works

 

The touch screen relies on frustrated total internal reflection (FTIR): when a user touches the screen, the touch frustrates the internal reflection and scatters non-visible infrared (IR) light. First, a projector projects the app onto the window. The user touches a sheet of vellum, coated on the front with a clear lacquer and on the back with a thin silicone layer to enhance the scattered IR light. IR LEDs surround the piece of acrylic just behind the vellum sheet. When the user presses on the vellum and acrylic, IR light scatters out of the acrylic and is detected by a PlayStation camera. The software, Community Core Vision (CCV), registers the scattered IR light as a “blob” and communicates the blob’s location to the app. Based on the blob’s location, the app responds appropriately (navigating to a new page, opening a picture, etc.).

 

The projector is an Optoma GT700 and projects a 1280×800 pixel image, mirrored so that it reads correctly from outside the window. The PlayStation 3 Eye camera is fitted with a visible-light-blocking filter and has no IR filter.

[Figure: Setup of the touch screen system]

Timeline and Daily Milestones/Accomplishments

 

The project officially began Friday evening, March 9th, and the team worked approximately 9:30am–11pm daily from Saturday, March 10th, through Saturday, March 17th. The students willingly gave up their Spring Break to complete the project and received no compensation or academic credit for their efforts.

 

The project began with a kickoff meeting on Wednesday, March 7, 2012. After understanding the objectives and general scope of the project, team members began researching application development, FTIR, and existing touch screen devices.

 

On Thursday evening the team met again, this time fleshing out the details of the application content and deciding who would be responsible for which portions of the project.

 

The team began Friday night by installing the software needed to develop the app on their laptop, most notably the Qt development environment and touch libraries. We also began considering the app’s look-and-feel, taking window measurements, and planning how to install the final touch screen. Projection paper options, including 3M projection paper, vellum, and butcher paper, were tested for projected image quality.

 

Saturday was a full day of app development and hardware testing. Basic app functionality was coded by creating all pages and buttons. The software team continued struggling to install the necessary software. Ultimately, they found the open-source Community Core Vision (CCV) software, which detects IR light as blobs, and decided to use it rather than writing their own. The hardware team built a test frame for an 8.5×11” acrylic sheet, removed the IR filter from an existing webcam, and tested blob detection with the three projection paper options (3M projection paper, vellum, and butcher paper).

 

On Sunday, app development continued rapidly with the addition of test pictures to buttons and the pulling of information from text files. A Dropbox folder was created to store files for the app to pull from. We confirmed that rather than writing new software to translate the blobs into (x,y) coordinates the app could use to interact with the user, we would use CCV. The hardware team continued testing projection and blob detection techniques, primarily by adding silicone to the backs of each paper type.

 

App development continued on Monday, as did development of the app’s look-and-feel. Pictures and past project descriptions were pulled from Dr. Huntoon’s archives or written fresh and added to the Dropbox folder. All necessary page content was written as well. With CCV set up, the software work was complete, so those team members joined in on app development. The hardware team created and painted a hanging mount for the laptop, projector, and camera and took the acrylic sheets to the machine shop to be cut.

 

On Tuesday, app development continued. The necessary background and button images were created so they could be loaded into the app on Wednesday. The hardware team picked up the cut acrylic sheets from the machine shop and polished the edges. A thin silicone layer was painted onto one side of the vellum sheets so they could be installed Wednesday. The team also decided to create two neighboring touch screen windows rather than just one, given the projector’s projection region and the functionality a second screen could add.

 

Wednesday consisted primarily of detail work. Adjustments to the app’s look-and-feel were made, particularly to backgrounds, button sizes, and fonts. Scrolling pictures were also added to the home page. The hardware team sprayed a clear coat on the vellum sheets, installed the IR LEDs along the edges of the acrylic, and installed one of the two touch screen panels. After that, the team successfully tested blob detection on the installed panel.

 

On Thursday, app development continued. The hardware team installed the LEDs on the second acrylic pane and installed it to complete the second touch screen window. They also created a mounting bracket for the projector and camera.

 

On Friday, the team made a detailed to-do list to structure the day’s activities. The hardware team re-installed the camera to make it more secure and painted the projector and camera mount. We successfully calibrated the touch screen with the camera and were able to play with the touch functionality. After an initial projection onto the touch screen, the team decided to switch to a lighter blue and white color scheme because the black and red was too hard to read. The software team finished coding and debugging the app. Finally, the entire system was successfully implemented and tested on the touch screen.

Hardware

 

Camera and Projector Mount

 

We initially thought that creating a mechanism to hang the projector, camera, and laptop would not be very difficult. It turned out to be challenging because the projector and camera had to be at just the right height, angle, and distance from the screen.

 

We started by creating C-brackets to hang from the lip of the drop-down ceiling using 2x4s and metal L-brackets. The top part of each C-bracket is braced on top of the drop-down ceiling to keep it flush against the ceiling’s bottom portion. We then made a platform for the laptop to sit on using ½” plywood and all-thread, suspended from the C-brackets. We initially wanted to hang the projector upside down on the bottom side of the laptop board; however, this placed it too far from the screen. To overcome the distance, we purchased a projector mount that allowed us to move the projector 10” forward and 42” down. We had difficulty installing the projector mount due to its poor design and manufacturing and the need to fabricate a new extension bracket to hang the projector lower. The camera needed to be slightly farther from the screen than the projector, so we used PVC pipe and sheet metal to create a second support off the projector extension.

 

When calibrating the projector and camera, additional adjustments to the suspension mechanism were needed to lower the projector’s angle. We ended up drilling one hole in the ceiling to support the front lip of the C-bracket and achieve this.

 

Projection Screen

 

The second major component of hardware was the projection screen. We experimented with three types of materials:

  • 3M Projection Paper – This is what we initially thought we would use, based on projection quality. However, it did not allow for adequate touch sensitivity.
  • Glass Tracing Paper – This option had low projection quality, so we ruled it out early in experimentation. We did not test it with silicone.
  • Vellum Drawing Paper – This is what we ultimately used: it has good projection quality and, with a thin silicone coat, excellent touch sensitivity.

 

Projection Screen Installation

 

We read online that a silicone layer between the projection material and the acrylic would enhance blob detection. We first mixed a silicone replacement, Lexel, with paint thinner; however, it produced a very tacky substance that caused the paper to stick to the acrylic even after the touch had been released, which was bad for blob detection. When we moved to a silicone and paint thinner mixture, blob detection was excellent and the tackiness problem was resolved.

 

We sprayed the front of the vellum with clear coat spray paint to decrease the absorption of finger oils into the paper.

 

Acrylic and LEDs

 

Because glass diffuses infrared light, we used acrylic to create the touch screen. We cut the sheets to the appropriate size for the window and sanded the edges smooth. We then used electrical tape to affix strips of IR LEDs around the perimeter of each acrylic sheet. Placing the LEDs directly against the edge of the acrylic helped direct the IR light into the sheet, giving better scattering and better blob detection.

 

The LEDs are powered by a 12-volt, 7.5-amp power supply.

Application

 

The entire application was written in C++ using the Qt development environment. The main application on the left window has information about the Innovation Gym, Innovation Gym projects, the Lyle School of Engineering, and how to get involved in the Innovation Gym. The app on the right window has generally helpful information, such as the time and weather forecast, and includes a space to spotlight student projects.

 

Pages

 

 

The most dynamic part of the app is the project pages. Past Projects are primarily IDEs; however, this section can include any Innovation Gym project. All Past Projects are displayed on a template project page that pulls descriptions and student information from text files, and pictures and videos, all from the Dropbox folder.
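
As an illustration, the template page might read one of these text files with QFile and QTextStream. This is a minimal sketch, not the project’s actual code; the function name and path convention are assumptions based on the Dropbox layout described in the User Manual below.

    #include <QFile>
    #include <QString>
    #include <QTextStream>

    // Sketch: load a past project's description from its Dropbox folder
    // (e.g. .../Projects/Past_Projects/<project>/description.txt).
    QString loadDescription(const QString &projectFolder)
    {
        QFile file(projectFolder + "/description.txt");
        if (!file.open(QIODevice::ReadOnly | QIODevice::Text))
            return QString();          // missing file: show nothing
        QTextStream in(&file);
        return in.readAll();           // whole description as one string
    }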

 

The Student Projects information on the right home page will feature senior design or individual student projects. Students can request to have their project featured by emailing a 500-word description and an image to innovationgym@smu.edu.

 

Other parts of the app are relatively static, and their information will only need to be updated occasionally. This can be done in the respective text files in the Dropbox folder.

 

Code Used

 

Sixteen classes were created; two of them were adapted from code found on the internet, and one went completely unused. Each page has its own class, since the layout differs from page to page. The 14 classes that were used involve different layouts but usually include private variables for button icons and button functions. Some classes have private objects that store text file information pulled from the Dropbox folder.

 

FTP Getter – This class was created to pull information from the genuse1.lyle.smu.edu server but failed to do so. It has functions to pull a file, to store the text of a pulled file in a private variable, and to return the information contained in that variable. It also uses Qt’s signal-and-slot mechanism: a slot is a normal C++ function that can be connected to a signal and is called automatically when that signal is emitted.
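
As background, here is a minimal sketch of the signal-and-slot pattern just described. The class and member names are illustrative assumptions, not the project’s actual code.

    #include <QObject>
    #include <QString>

    // Hypothetical sketch of the signal/slot pattern described above.
    class FtpGetter : public QObject
    {
        Q_OBJECT
    public:
        QString text() const { return m_text; }    // return the stored file text

    signals:
        void fileFetched();                        // announced when a pull completes

    public slots:
        void onDataReceived(const QString &data)   // a normal C++ function that a
        {                                          // signal can be connected to
            m_text = data;                         // store the pulled text
            emit fileFetched();                    // notify any listeners
        }

    private:
        QString m_text;
    };

    // Elsewhere: connect(source, SIGNAL(done(QString)),
    //                    getter, SLOT(onDataReceived(QString)));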

 

The following pages are all created with a QString that contains the base address of the folder inside Dropbox where all of the application’s files are located (testfolder/screen).

 

  • About Page
  • Buildings Page
  • Creators Page
  • Current Projects Page
  • Departments Page
  • Lyle Page
  • Project Page
  • Student Talent Needed Page

 

Each of these classes pulls the text file that contains a library of the addresses needed to find its information. Each also includes a QToolButton known as the back button, which technically closes the current page and takes the user back to the previous page, along with a picture button and an icon for the button’s functionality. Inside these classes, a QTextBrowser object stores the information that is displayed on the screen.
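
Put together, one of these page classes might look like the following sketch. The names are illustrative assumptions; only the ingredients (a base-address QString, a QToolButton back button, and a QTextBrowser) come from the description above.

    #include <QString>
    #include <QTextBrowser>
    #include <QToolButton>
    #include <QWidget>

    // Illustrative sketch of one of the eight page classes described above.
    class AboutPage : public QWidget
    {
        Q_OBJECT
    public:
        explicit AboutPage(const QString &basePath, QWidget *parent = 0)
            : QWidget(parent), m_basePath(basePath)
        {
            m_back = new QToolButton(this);                          // the back button
            connect(m_back, SIGNAL(clicked()), this, SLOT(close())); // closes this page
            m_browser = new QTextBrowser(this);                      // shows the page text
            // ...pull the address-library text file from m_basePath here...
        }

    private:
        QString m_basePath;        // base address, e.g. "testfolder/screen"
        QToolButton *m_back;
        QTextBrowser *m_browser;
    };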

 

The difference between the following two classes and the eight above is that they contain QTextBrowser pointers.

 

  • Past Project Template
  • Future Projects Page

 

These two pages update autonomously, based on the project being viewed, using multiple functions within the class.

 

 

Analog Clock – The analog clock in the right-hand app is an example clock from the Qt website. It draws the hour and minute hands and repaints itself every second.
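
The repaint-once-per-second behavior comes from a QTimer whose timeout signal is connected to the widget’s update() slot, as in Qt’s published analog clock example:

    // Inside the clock widget's constructor (Qt 4 syntax).
    // Requires #include <QTimer>.
    QTimer *timer = new QTimer(this);
    connect(timer, SIGNAL(timeout()), this, SLOT(update()));   // repaint on each tick
    timer->start(1000);                                        // tick once per second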

 

Weather App – The weather app is copyrighted by Nokia Corporation. It automatically updates every 15 minutes, and we added a refresh button so users can interact with the weather widget. The code pulls weather information, such as temperatures and descriptive pictures, from a weather website. It has functions that change the size of the window the app displays in and auto-correct it to the window size available.

 

Linked List – The linked list class contains nodes connected by two pointers, named next and prev. The nodes point to each other to prevent a memory leak, because every node remains reachable through the Twitter Display. Each node contains a string for the user name and a string for the tweet, which are distributed to the nodes through the Twitter Display class.

 

Time Stamp – Time Stamp is an object containing six integer variables: year, day, month, hour, minute, and second.

 

Twitter Display – Twitter Display has a linked list pointer, head, which points to the very first node. Through that node it reaches the rest of the list, distributing the information taken from the Twitter website through the digest function. Nodes are automatically sorted by time stamp as they are inserted into the linked list.
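
A minimal sketch of the node structure and the sorted insertion described above, assuming newest-first ordering; the actual member names in the project’s code may differ.

    #include <string>

    // Sketch of the doubly linked tweet list described above.
    struct TimeStamp { int year, day, month, hour, minute, second; };

    struct Node {
        std::string user;    // tweet author
        std::string tweet;   // tweet text
        TimeStamp   when;    // when the tweet was posted
        Node *next;
        Node *prev;
    };

    // Insert a node so the list stays sorted by time stamp.
    // 'newer(a, b)' is assumed to return true when a is more recent than b.
    void insertSorted(Node *&head, Node *n,
                      bool (*newer)(const TimeStamp &, const TimeStamp &))
    {
        Node *cur = head, *last = 0;
        while (cur && newer(cur->when, n->when)) {   // walk past newer tweets
            last = cur;
            cur = cur->next;
        }
        n->next = cur;                               // splice n between last and cur
        n->prev = last;
        if (cur)  cur->prev  = n;
        if (last) last->next = n;
        else      head       = n;                    // n becomes the new head
    }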

 

[Figure: Linked List and Twitter Display relationship diagram]

 

Clickable Label – A subclass of QLabel. It keeps all the functionality of QLabel but overrides the method mousePressEvent() to emit a signal that the mouse has been pressed. When pressed, a Clickable Label opens a new window displaying the clicked picture in full screen.
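
A minimal sketch of such a subclass follows; the full-screen picture window the project opens in response to the signal is omitted here.

    #include <QLabel>
    #include <QMouseEvent>

    // Sketch of the Clickable Label idea: a QLabel that reports mouse presses.
    class ClickableLabel : public QLabel
    {
        Q_OBJECT
    signals:
        void clicked();                            // emitted on every mouse press

    protected:
        void mousePressEvent(QMouseEvent *event)   // override QLabel's handler
        {
            emit clicked();                        // tell listeners we were pressed
            QLabel::mousePressEvent(event);        // keep QLabel's default behavior
        }
    };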

 

Input Data Files

 

We first attempted to use the File Transfer Protocol (FTP) to pull information, but we were unable to implement this because we could not connect to the genuse1.lyle.smu.edu server, likely because it is a secure server, and because files that were supposedly downloaded could not be found. We then decided to use a Dropbox folder: it is cheap, can be easily updated remotely from another computer, does not have server access issues, and downloaded files reliably appear for the app to use.

 

The Dropbox file structure follows the page structure to make text, picture, and video files easy to find.

 

Look-and-feel

 

We initially decided on a black, red, grey, and white color scheme because it is eye-catching and looked good when projected onto the touch screen. It also loosely follows the color scheme already used on the Lyle Engineering website. When we finally projected the app onto the projection screen, however, there was not enough contrast and the text was too difficult to read, so we moved to a lighter blue and white color scheme.

 

To place objects in their proper locations, each page of the app was laid out as a table. The main page title and background colors on each page are a single JPEG image created in Adobe InDesign CS5 and Adobe Photoshop CS5. The buttons are GIF images created with the same tools. All images can be found in the Dropbox folders.
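
The “table” layout corresponds to Qt’s QGridLayout, which places each widget in a (row, column) cell. A minimal sketch with hypothetical widget labels:

    #include <QGridLayout>
    #include <QLabel>
    #include <QPushButton>
    #include <QWidget>

    // Sketch: lay a page out as a table. The labels are illustrative only.
    QWidget *makePage()
    {
        QWidget *page = new QWidget;
        QGridLayout *grid = new QGridLayout(page);
        grid->addWidget(new QLabel("Innovation Gym"), 0, 0, 1, 3); // title spans 3 columns
        grid->addWidget(new QPushButton("About"),     1, 0);       // one button per cell
        grid->addWidget(new QPushButton("Projects"),  1, 1);
        grid->addWidget(new QPushButton("Lyle"),      1, 2);
        return page;
    }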

Software

 

The software allows users to interact with the application displayed on the touch screen.

 

We first attempted to code the touch screen in Linux using the OpenCV and TouchLib libraries. We had issues calling OpenCV, which meant we could not use TouchLib, since it depends on OpenCV.

 

After an extensive internet search, we found Community Core Vision (CCV), an open-source, cross-platform solution for machine sensing maintained by the NUI Group. Using CCV also meant switching from Linux and the PandaBoard to Windows and a donated Windows-based laptop.

 

CCV detects the IR “blobs” created where a user’s finger scatters the light and outputs the coordinates of each blob. Those coordinates are treated as a mouse, effectively making the touch screen a giant computer screen. This allows users to interact with and navigate around the app.
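
CCV reports blob positions normalized to the calibrated area (0.0 to 1.0 on each axis), so turning a blob into a cursor position is a simple scaling. A sketch, assuming the calibrated area spans the full 1280×800 projected image:

    // Sketch: map a normalized blob coordinate to projector pixels.
    struct Blob { float x, y; };                 // normalized, 0.0 - 1.0

    void blobToPixels(const Blob &b, int &px, int &py)
    {
        const int width = 1280, height = 800;    // projector resolution
        px = static_cast<int>(b.x * width);      // scale to pixel column
        py = static_cast<int>(b.y * height);     // scale to pixel row
    }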

 

Calibrating the projector and CCV proved difficult because overscan meant that not all calibration points fell on the touch screen. We solved this by manually entering location data into the software’s .xml calibration file.

 

User Manual for Updating Content

Add a new Past Project

 

  1. In the Dropbox folder testFolder, go to screen\Projects\Past_Projects.
  2. Right click → New → Folder. The folder title should be the project title.
  3. Create a .txt file with the project description and save it as “description.txt” in the project folder you just created.
  4. Create a .txt file with the students’ names, major, and year. Save it as “students.txt” in the project folder.
  5. Add pictures to the project’s folder. Save pictures as P1.jpg, etc. Create thumbnails for each picture and put them in a folder titled “LowRest”.
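
For example, after these steps a new project folder might look like this (the project name “Solar_Car” and the picture count are hypothetical):

    Past_Projects\
        Solar_Car\
            description.txt
            students.txt
            P1.jpg
            P2.jpg
            LowRest\
                P1.jpg
                P2.jpg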

 

Format Future Projects

 

To add a future project, go to testFolder\screen\Projects\Future_Projects in the Dropbox folder. Open the project .txt file that you want to update. The first line is the project title. On a new line, type a short description of the project. If there are no future projects to list, leave all future project .txt files blank. You can only have 4 future projects listed at any given time.
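
For example, a future project .txt file might contain the following two lines (both hypothetical):

    Solar Car
    A student-built solar vehicle to race in intercollegiate competition.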

 

Format for Current Projects

 

To update the current projects, go to testFolder\screen\Projects\Current_Projects in the Dropbox folder. Open the project .txt file that you want to update. The first line is the project title. On a new line, type a short description of the project. If there are no current projects to list, leave all current project .txt files blank. You can only have 2 current projects listed at any given time.

 

Adding Student Projects

 

To create a new student project, type a short description and save it as a .txt file in the Dropbox folder in testFolder\screen\Projects\Student_Projects. The file name should be the project title. Upload a picture with the same file name. You can have any number of projects showcased at a given time.

 

Adding Talent Needed

 

To add involvement opportunities or talent needed, go to testFolder\screen\Talent_Needed. Open the .txt file. Enter the talent needs according to the following format:

Major

Skills needed

Major

Skills needed

(etc.)

There should only be one piece of information per line. There can be up to 5 postings at any given time.

 

Outcomes

 

The project successfully created a touch screen for students and prospective families to learn about the Innovation Gym in an engaging manner. The system is eye-catching and provides a window into the past, present, and future work in the Innovation Gym. It will hopefully spark interest in the Innovation Gym among current SMU students and will provide a great talking point when recruiting prospective engineering students and families. The hardware, software, and application components were all successfully integrated to create a system that users can interact with.

 

Along the way, team members learned many valuable skills not explicitly stated in the project objectives. Technically, the team learned how to solder, how to use pointers, the value of a C++ destructor, how to use a caliper, not to leave tools (hammers) on ladders, how to use a Linux terminal in greater depth, and how FTIR works. Non-technically, the team learned how to be flexible when plans go awry, how much work can be accomplished when you focus on one task for an extended period of time, how to apply their skills, and how to jump into tasks to contribute to the project’s success. Most importantly, the team took a project through all the concept, design, build, and test phases of the engineering process. This hands-on experience taught each of us more about engineering than a classroom or traditional lab ever could.
