
Saturday, May 12

Recap this semester at Cooper Union...

I have never been as busy as I have been this past week. A number of terrific things have been happening, and I am so proud of my students and all the effort they put in. Please, let me explain.

Cooper Union has a wonderful history of innovation and entrepreneurship in the areas of art, architecture and engineering. This week, the school celebrated the groundbreaking of its new Engineering Building that is to be completed in 18 months, in time for the 150th anniversary of the creation of Cooper Union. A lofty goal, but one that has magic in it for the future of New York and the tech community within it.

I have been an adjunct professor here for about two years and have been amazed at the students and their work, especially their ability to work themselves to the bone when faced with almost impossible tasks. This semester, I got to teach "systems" in Mechanical Engineering and "emerging software paradigms" - or Web 2.0, for short. The two classes were very different in style and content, but the students were the heart of what Cooper is proud to show as its very best.

ESC161 - Systems Engineering, Mondays 6-9pm
In a former life, I was a roboticist - working on a new robotic manipulator at Stanford, building a control system that would allow a mechanical arm to work with incredible speed and precision. I have always been fascinated with "systems" and how they work, and I remember fondly my control theory classes (yes, sounds kind of strange, I know) where we would determine the equations that govern the operation of a system and then design a control/compensator to deliver the performance we desired.

In my class here at Cooper, I spent a lot of time working through the mechanics of systems so that my Mechanical Engineering students could understand how the physical world can be modeled by mathematics and how, when they take Controls next semester, it all "works". It was a tough first half of the semester, and the first test was a killer. But I was impressed with the students' resolve to understand the material after the test, and with how we - yes, me included - worked to figure out how to get everyone to understand it. The second test, which I am in the midst of grading, was longer, but fairer (IMHO). But the thing that impressed me (and infuriated me) was the simulations that the students accomplished. No, they were not that magical. But the extra effort that some of the students made was extremely impressive, and it made me smile that they had gone that extra mile. And I wanted to say thanks to them.

ECE463 - Emerging Software Paradigms/Web 2.0 Paradigms
This class was born out of the Red Hen Spectra project - where I had built a team of students working on a collaborative curation project for chemical spectra (yes, extremely esoteric) - and out of my discovery that programming done in schools is primarily simple problem solving with one customer, the professor. For my class, I chose to come up with a different kind of customer: the owners of problems that could be solved using Web 2.0 paradigms (e.g. collaboration, mashups, fast applications). The class was brand-new, and 10 students took on the challenge to help me build the course.

The first half of the course was a survey of Web 2.0 concepts and then migrated into rapid application development and product requirements. Instead of being handed an assignment, the students had to find "customers", address their problem, and define their own deliverables due at certain dates. We spent a bunch of time on processes, but we also took an excursion into user interface and usability so that, instead of a simple command-line interface, the goal was to build a full application, from database to user interface - complete with wireframes, use cases, feature specifications and a marketing requirements document (à la Garage.com).

The final grade was determined by their product (did it work as specified?) and their presentation to a group of investors, entrepreneurs and other interested parties. As you can see from the two articles, the students did exceptionally well:

I was very impressed by what the students have put together in the very short timeframe. Each of the ideas could be marketable with some tweaks. The teams focused mainly on what the tool would do, not as much on revenue or go-to-market plans.
I could tell that a couple of the students were nervous. I think colleges and universities need to offer more classes and make public presentations so that students can gain confidence in this area. This becomes more important as everyone wants to create the next Google. Pitching to a VC or other parties can be a make or break and these forced presentation opportunities are critical.
and from CNET,
This afternoon, I went over to NYC's Cooper Union to sit in on the final project presentations for the Web 2.0 Paradigms class, a hands-on course in the school's electrical engineering department taught by adjunct professor Sanford Dickert. In this course, the students--who were required to have software development experience--created their own Web applications from start to (very beta) launch, with a focus on the end user experience and what kinds of consumers would use such a service.
...
But who knows? These are projects that were conceived and launched in a span of six to eight weeks. The students clearly all knew what they were doing, development-wise. If they continue what they started, some of these could turn into interesting pieces of webware.
Caroline McCarthy, CNET: Webware, "Web 2.0 Gets Schooled"

When you get a chance to see your efforts turn out in such a way, all I have to say is thanks to all of these students for their hard work and dedication. They did an incredible job and deserve the kudos they got in both of these articles.

Now, I have to finish grading these exams (yikes!) and get the scores in. Red Hen is still moving forward (we spent Friday cleaning the lab from top to bottom) and I am only now getting a chance to reply to emails and to put up this post.

Thanks to Caroline, Allen, and all of our visitors on Wednesday. I also want to extend special thanks to Profs. Fred Fontaine and Carl Sable, who helped me put the Emerging Software Paradigms course together, and to Prof. Stan Wei for giving me the opportunity to teach the Systems Engineering course.

See ya after the summer. And keep an eye out for Red Hen Spectra.

Wednesday, February 7

What is the SPECIAL Project?

A couple of friends have noticed I have been out of touch for the past couple of weeks, especially with all of the energy going into the political sphere and the "most wide open race" in history. The reason is a project I have been contributing my time to at Cooper Union. While teaching there as an adjunct, I found myself engaged in an interesting effort to help identify chemical spectra.

“What are chemical spectra?”, I hear you ask. Without getting too “technical”, I often explain a chemical spectrum as the “fingerprint” that a chemical “emits” when it is excited by an energy source. There are a number of different ways of generating this “fingerprint” from a chemical compound – illuminating it with some form of light (primarily infrared, or IR), placing it within an electromagnetic field (nuclear magnetic resonance, or NMR), or measuring the molecular weights of the components of the material and the ratios in which they occur to work out the makeup of the compound (mass spectrometry, or MS). As you measure the compound’s reaction to the stimulus across the frequencies (or wavenumbers), you see a response that is measurable and quantifiable. A great tutorial in some respects can be found at the Purdue Library site.

A more familiar analogy for this concept is the frequency response of a stereo speaker. When you speak to a stereophile, you will learn how the speakers you choose can modify the "highs" and "lows". As sound energy is generated within the speaker, the physical characteristics of the speaker (e.g. the speaker cone, the speaker housing, the magnetic coil) all contribute to its dynamic response. If there is a vibrational mode (a point in the frequency range that causes the speaker to resonate) within the range of hearing, then the dynamic response will be affected. The opera singer shattering a crystal glass is a classic example of a vibrational mode gone awry.

Spectroscopy uses the same concept, but with different energy sources, and it measures the vibrational intensities of the chemical bonds that exist within a compound. Carbon-oxygen bonds vibrate at one frequency; carbon-hydrogen bonds vibrate at another. Think of a compound as being made up of a number of different springs, dampers and masses, each with a different response to excitation. Now, consider that the compound is a three-dimensional object being measured in a two-dimensional fashion. This leads to a lot of interpretation.

Years ago, spectroscopy was considered an art – IR spectra were collected and bound into books and journals, and researchers would pore over the images to see if there were similarities between their experimental data and the reference data collected in the journals. As we have moved toward improved systems and accuracy (NMR is highly accurate and discriminatory), the electronics used in spectroscopy now offer tools for prediction and identification of spectra based on the libraries within the spectrometer’s reach. But the challenge has remained: do you have enough spectra to cover all the potential spectra out there?
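To make the idea of a library search a little more concrete, here is a minimal sketch in Python of scoring an experimental spectrum against a tiny reference library using a simple cosine-similarity measure. The compound names, the numbers and the choice of metric are all hypothetical – real spectrometer software uses much larger libraries and far more sophisticated matching – but the principle is the same.

```python
import numpy as np

# Hypothetical reference library: name -> absorbance values sampled on a
# common wavenumber grid (real libraries hold thousands of entries).
library = {
    "ethanol":  np.array([0.10, 0.85, 0.40, 0.05, 0.60]),
    "methanol": np.array([0.12, 0.20, 0.75, 0.30, 0.10]),
    "acetone":  np.array([0.55, 0.15, 0.10, 0.80, 0.25]),
}

def similarity(a, b):
    """Cosine similarity between two spectra (1.0 = identical shape)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def best_match(measured, library):
    """Return the library entry whose spectrum most resembles the measurement."""
    scores = {name: similarity(measured, ref) for name, ref in library.items()}
    return max(scores, key=scores.get), scores

# A noisy measurement of ethanol from an imperfect instrument.
measured = np.array([0.12, 0.80, 0.45, 0.07, 0.55])
name, scores = best_match(measured, library)
print(name, scores)
```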

A vision of a SPECTRAL Universe
While my background is in electrical engineering and robotics, as a kid I used to love playing with my chemistry set. I did not know about the concept of spectroscopy then, and if I had known about it when doing my PhD work, I might have laughed at how similar the two fields truly are. I used to spend hours causing my robotic arm to vibrate at different frequencies so I could see the output of the system and characterize its dynamic response. Even though it was non-linear (linearity properties did not apply under all conditions) and time-variant (the gears behaved differently warmed up after use versus just starting), the system had a reasonably linear response over a working range – which I would then characterize and attempt to identify. In spectroscopy, the same concept applies, but with a chemical compound one is not always certain of the same conditions. The measurement of a compound’s spectrum can have a number of sources of variance, including:

  • Concentration levels versus contaminants
  • Instrumentation differences (e.g. resolution error)
  • Operator error (student versus a skilled technician)
  • Temperature
  • Humidity
  • ...and on and on...
While these might sound dreadful to overcome, I have come to think of these measurements as clues in a mystery: what is the true representation of a chemical compound in a particular spectral modality? Instead of one perfect spectrum for ethanol, I believe there are many measured versions of ethanol’s spectrum in a given modality that can be used to ascertain the “perfect” spectrum. Somewhat like Plato’s concept of the “Beautiful” or the “Truth”, the perfect representation is always going to be just out of reach, while we continue to measure and attempt to find it.

Instead of searching for perfection, my belief is that accuracy can be found through clustering – the same concept that search engines and data-mining companies use on a regular basis.

To bring “clustering” into perspective, think of the universe of spectra as infinitely vast and empty. For our simple discussion here, let us choose a chemical – say, ethanol – which, after being measured by the perfect spectrometer, has a spectrum with three values (xe, ye, ze). Also assume that every spectrum has a three-dimensional point assigned to it (x, y, z). Now, ask a group of scientists (let’s say 50) to measure the spectrum of ethanol (same composition, same concentration, everything). When all 50 points are gathered and plotted in the universe, you would see them clustered primarily around (xe, ye, ze) – which would make any other spectrum falling within that cluster likely to be ethanol.
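As a thought experiment only, here is a small Python sketch of that toy universe: 50 simulated measurements of ethanol scattered around a made-up “true” point (xe, ye, ze), a cluster centre estimated from them, and an unknown spectrum judged by whether it falls inside the cluster. Every number and threshold here is invented purely to illustrate the clustering idea.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "true" ethanol point (xe, ye, ze) in the toy spectral universe.
true_ethanol = np.array([2.0, 5.0, 1.0])

# 50 scientists measure ethanol; each measurement lands near the true point,
# scattered by concentration, instrument, operator, temperature, humidity...
measurements = true_ethanol + rng.normal(scale=0.2, size=(50, 3))

# The cluster centre is the mean of the 50 points; the radius is how far
# the measurements spread from it.
centre = measurements.mean(axis=0)
radius = np.linalg.norm(measurements - centre, axis=1).max()

def looks_like_ethanol(spectrum):
    """An unknown spectrum is 'likely ethanol' if it falls within the cluster."""
    return np.linalg.norm(spectrum - centre) <= radius

print(looks_like_ethanol(np.array([2.1, 4.9, 1.1])))  # near the cluster -> True
print(looks_like_ethanol(np.array([7.0, 1.0, 3.0])))  # far away -> False
```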

This is the basis for my concept of the Spectral Universe.

Now consider that a real spectrum can have 2,000 points or more to represent it. This “universe” could conceivably have over 2,000 dimensions in which to identify which spectrum is which. And, with samples from a large group of people, clusters of spectra would help identify chemical compounds – even ones that are corrupted by any number of variance factors – all by surfing the “spectral universe” to determine the most likely candidate. This was the inspiration for the SPECIAL Project.

The SPECIAL/Red Hen Project
Originally started by Professor John Bove at Cooper Union as a Windows client for searching your personal spectral library, the SPECIAL Project evolved into a demonstration of collaboration and open data between chemical spectroscopists and scientists, using technology to improve the identification of chemical compounds. Instead of relying on an absolute reference managed by vendors and other providers who maintain a lock on innovation by controlling their intellectual property, the SPECIAL Project is meant as an opening – a marketplace that allows chemists and academics to work together to create a common spectral database and to build on that platform to improve the capabilities within it.

Instead of relying solely on the innovation cycle of hardware vendors, chemists can develop new ways of evaluating chemical spectra and other chemical data, allowing for richer discoveries through the aggregation of many sources of data into a single framework. Similar to what has been discussed around .NET and "mashup" APIs, the SPECIAL Framework will allow disparate data sources to work in concert with other applications, once the interfaces and rules have been designed and developed to provide the protections needed by each data source owner.

By respecting the intellectual property rights of the creator of the spectra (just like the rights owner of any form of digital media, such as music or movies), the SPECIAL Project is designed to give database owners a framework to monetize their spectral data through per-use offerings while also supporting the free exchange of spectral data amongst members of the community. With an additional focus on interoperability with other software platforms and hardware systems (e.g. spectrometers have their own data storage formats), the goal is to create a system that allows for mashups of hardware and software to create more compelling applications using spectroscopy.

When NASA sends off the Mars Science Laboratory in 2009, onboard will be a spectrometer the size of today’s Palm Treo. Think about the possibilities if you could have your own personal spectrometer for evaluating caloric content. Or for detecting breast cancer without invasive surgery (mammograms in the privacy of your own home). Or spectral detectors of bomb materials at airport security checkpoints, more reliable than what is offered today.

Am I painting a future of the Star Trek tricorder? I sincerely hope so. But to get there, we have some technology research to do. We are very close. William Shatner - want to do another special on the effects of Star Trek?