Extraterrestrial Life

We had a brief conversation about extraterrestrial life in class today, and I immediately thought of this article on the Fermi Paradox. It gives a great summary of the arguments for why we haven’t heard from extraterrestrial life-forms yet, why they might exist, and why they might not: http://waitbutwhy.com/2014/05/fermi-paradox.html

Transcoding and Protein Music

Partners: Gus Kitchell & Charles Elmer

Overview

This project was my first stab at transcoding – converting one source of information into another format. In this case, we converted biological information (the amino acid sequence of a protein) into an auditory output. The specific protein we chose was the ZENK protein (also known as Early Growth Response Protein 1, or EGR-1) found in the forebrain of the Zebra Finch (Figure 1), a species that has been studied extensively and for which a great deal of genetic information is available. The ZENK protein is involved in vocal communication, and studies have suggested that its production may be triggered when songbirds hear the songs of other individuals of the same species (Mello and Ribeiro, 1998). For this reason, we saw a parallel between the function of the ZENK protein and the goal of our project.

zebra-finches

Converting biological concepts or transcoding biological information into music has been done by a variety of artists – we are certainly not the first to take on this project. However, we feel that the music produced by these projects tends to be rather abstract and cacophonous. Perhaps this should not be a surprise; there is no inherent reason that a sequence of amino acids should magically transform into a chart-topping pop hit. However, Charles and I were still very interested in the idea of protein music, and thought that it might be beneficial to produce a musical piece that is more accessible for modern listeners. In short, we wanted to create something that sounds more like a popular song that you might hear on the radio, while still using biological information as the foundation of our project. Our reasoning was fairly simple: although we were intrigued by the dissonant music produced by past transcoding artists, we found it hard to connect with. For the average listener, it can be hard to see the connection between the dissonant sounds and their biological source material. As a result, they may feel overwhelmed and ultimately uninterested. It is hard to interact with something you don’t understand. Thus, we sought to produce music that would be more familiar to the average listener, and might spur them to ask questions. What protein sequence did you use to create the bass-line? How can the same sequence be used to create a guitar melody and a drum solo? How is it possible to convert a protein sequence into a catchy song? While these questions still show a general lack of understanding from the listener, we hope that they can at least conceptualize what they are listening to. If we have done our job well (that is, produced a catchy tune), we hope that it will prompt more listeners to engage with the idea of transcoding and protein music.
Thus, the ZENK protein, which is involved in vocal communication in the Zebra Finch, seemed like a particularly suitable foundation for our project, which attempts to communicate the concept of protein music to a general audience.

Methodology

We began by downloading the amino acid sequence for the ZENK protein, found on the RCSB Protein Data Bank.

CDRRFSRSDE LTRHIRIHTG QKPFQCRICM RNFSRSDHLT THIRTHTGEK PFACDICGRK FARSDERKRH TKIHLRQKDK KVEKAAPAST ASPIPAYSSS VTTSYPSSIT TTYPSPVRTA YSSPAPSSYP SPVHTTFPSP SIATTYPSGT ATFQTQVATS FSSPGVANNF SSQVTSALSD INSAFSPRTI EIC

(http://www.rcsb.org/pdb/protein/O73693?evtc=Suggest&evta=ProteinFeature%20View&evtl=OtherOptions)

The sequence is written in the conventional amino acid “single letter code,” which is shown below.

Single Letter Code

  • G – Glycine (Gly)
  • P – Proline (Pro)
  • A – Alanine (Ala)
  • V – Valine (Val)
  • L – Leucine (Leu)
  • I – Isoleucine (Ile)
  • M – Methionine (Met)
  • C – Cysteine (Cys)
  • F – Phenylalanine (Phe)
  • Y – Tyrosine (Tyr)
  • W – Tryptophan (Trp)
  • H – Histidine (His)
  • K – Lysine (Lys)
  • R – Arginine (Arg)
  • Q – Glutamine (Gln)
  • N – Asparagine (Asn)
  • E – Glutamic Acid (Glu)
  • D – Aspartic Acid (Asp)
  • S – Serine (Ser)
  • T – Threonine (Thr)

We then ran the amino acid sequence through a MIDI note converter created in MAX by our professor, Tim Weaver. This program reads the single letter code of the amino acid sequence and converts each letter into a MIDI note. Factors such as the volume, duration, note range, and instrument type can all be altered in the MAX patch (Figure 2).

MaxPatch

Figure 2. ZENK protein note conversion MAX patch

In order to create a melody that sounds more similar to a pop song, we converted each amino acid into a note within a G pentatonic scale. MIDI spans a 128-note chromatic range (notes 0–127), so a random assortment of notes tends to sound unmelodic. Thus, we only assigned amino acids to MIDI notes that fall within a G pentatonic scale. This required us to look up the MIDI note conversion chart (Figure 3).

logic-midi-note-numbers

Figure 3. MIDI note conversion chart.

https://freaksolid.files.wordpress.com/2013/03/midi_note_values.jpg

As noted above, we assigned each amino acid in the ZENK sequence to a note that falls within the G pentatonic scale (G, A, B, D, E, G), beginning with a 2nd octave G (MIDI note #55). The result was a text sequence that looks like the one shown below.

A, 55;

R, 57;

N, 59;

D, 62;

C, 64;

E, 67;

Q, 69;

G, 71;

H, 74;

I, 76;

L, 79;

K, 81;

M, 83;

F, 86;

P, 88;

S, 91;

T, 93;

W, 95;

Y, 98;

V, 100;

However, due to the arrangement of the amino acid sequence, this note assignment still produced consecutive notes that were multiple octaves apart, creating a jumpy and disconnected tune. To correct this, we took a simple shortcut, assigning the same set of notes to multiple amino acids. We selected a single octave, from G to G (#55 to #67), and created the text file shown below.

A, 55;

R, 57;

N, 59;

D, 62;

C, 64;

E, 67;

Q, 55;

G, 57;

H, 59;

I, 62;

L, 64;

K, 67;

M, 55;

F, 57;

P, 59;

S, 62;

T, 64;

W, 67;

Y, 55;

V, 57;
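For anyone who wants to reproduce this mapping outside of MAX, here is a minimal Python sketch of the single-octave assignment above (the actual conversion in our project was done in the MAX patch, not this code):

```python
# Single-octave G pentatonic (G to G, MIDI notes 55-67): the six notes
# from the text file above, cycled through the 20 amino acids in order.
PENTATONIC = [55, 57, 59, 62, 64, 67]  # G, A, B, D, E, G

# Amino acids in the same order as the assignment list above.
AMINO_ORDER = "ARNDCEQGHILKMFPSTWYV"

NOTE_MAP = {aa: PENTATONIC[i % len(PENTATONIC)]
            for i, aa in enumerate(AMINO_ORDER)}

def sequence_to_notes(seq):
    """Convert a single-letter amino acid sequence into MIDI note
    numbers, skipping any whitespace in the raw sequence."""
    return [NOTE_MAP[aa] for aa in seq if aa in NOTE_MAP]

# First ten residues of the ZENK fragment shown above:
print(sequence_to_notes("CDRRFSRSDE"))  # [64, 62, 57, 57, 57, 62, 57, 62, 62, 67]
```

Feeding the full ZENK sequence through this mapping gives the same note stream we recorded from the MAX patch.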

Once our notes had been restricted to a single octave (minimizing the large jumps between consecutive notes), we played the protein sequence and recorded the output in the music-editing program Logic (Figure 4). This allowed us to arpeggiate notes and add a variety of other effects in order to create a more modern sound (Figure 5). We repeated this process using multiple different instrument sounds, then overlaid each track to create a song that involved drums, bass, guitar, and synth, in a variety of octaves and tempos.

Screen Shot 2015-05-26 at 12.50.24 PM

Figure 4. Logic screen

Screen Shot 2015-05-26 at 12.50.40 PM

Figure 5. Arpeggiator screen (within Logic)

Overall, we found this project to be both difficult and enjoyable. Creating a pop song from a protein sequence is not an easy process. While our first attempt may not be a #1 hit, we think it is a decent first demonstration of the overlap between protein sequences and popular music, and hope that it will generate new interest in transcoding and protein music.

Cool Advances in Biosensing

We’ve been talking about biosensing in class, and I wanted to take a minute to post about a couple of cool advances that this field has seen in the last few years. Both are related to sending brain signals and muscle impulses.

The first is a video of electric signals from the brain being shared between two different people. Muscle sensors are attached to the nerve that controls the motion of the bottom three fingers on the hand, and the signal from one person’s brain is transferred to both people’s arms. Here’s the link to the video:

The second project involves two researchers who were able to send brain signals back and forth over the internet in order to control the muscle movements in one another’s hands. This is some seriously sci-fi stuff, with some crazy implications. My mind tends to stray towards thoughts of a super villain who is able to send brain signals over the internet to control the population, but I’m sure there are practical applications as well.

Here’s the link to the article: http://www.huffingtonpost.com/2014/11/08/brain-interface_n_6115334.html

And the video: https://www.youtube.com/watch?v=xRsx5egJoYk

Biosensing Final Project: the muscle-driven Hexbot

Team: Angus Kitchell, Charles Elmer, Ross Mansfield

Project Abstract:

With this project, we aim to create an interactive experience that is both fun and thought-provoking. Muscle impulses from the drummer inspire the hexbot to move forward, while hits registered from the sticks influence the direction it travels in. People are moved in many ways by music, whether physically or emotionally, or both. To view the hexbot moved by the biosensed impulses of a drummer’s arm and see it dance to the rhythm is to reimagine our understanding of music. We can become the influencer and the observer, and yet still partake in the experience.

This technology could also be viewed as a way to create a multimedia art piece. While we chose to have our muscle impulses control the movement of the hexbot, these impulses could be transcoded into a wide array of other formats to create a visual output. So, with muscle sensors hooked up to a drummer (as they are in this demonstration), the muscle impulses involved in the drumming process could be converted into an accompanying visual output that is intrinsically tied to the audio output. Thus, a dynamic multimedia piece is born, creating a bi-sensory experience for the audience. By presenting the muscle impulses of a drummer in a visual format, the audience can suddenly experience and appreciate the skill of the drummer (typically evaluated as an auditory output) in a whole new medium, potentially expanding their enjoyment and understanding of the performance. In addition, this concept could expand far beyond drumming. For example, this technology could be used to convert the muscle impulses from a soccer player’s calf, a weightlifter’s biceps, or an orchestra conductor’s forearm into a musical number, a visual display, or a tactile experience.

hexbot-doc

Project Description:

The key biological component of this project consisted of electromyography, which measures the electric potential associated with muscle activity. We began by testing a muscle sensor (Muscle Sensor v3, available on Sparkfun: https://www.sparkfun.com/products/13027), adjusting the positioning of the sensor pads and the sensitivity of the board until we settled on a combination that produced a consistent and controlled output. Given that our demonstration utilized drumming as the means to control the hexbot, we experimented with multiple muscle groups that are active during a drummer’s motion, ultimately targeting the Brachioradialis muscle on the inner forearm.

To convert the muscle impulses into directions for the hexbot, we downloaded an example Arduino sketch and hacked it to fit the needs of our project. This involved setting a threshold voltage to initiate the command for forward movement and adjusting the regularity of the signal so that the hexbot would respond to muscle activity with minimal lag between the signal and the action. Thus, the signal had to be frequent enough that the hexbot could respond to multiple commands made in quick succession (or stop moving shortly after the last command was issued), but with enough space between signals for the mechanical aspects of the hexbot to register and respond to each one.
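In rough terms, the threshold-and-timing logic looked like the sketch below (written in Python for readability rather than Arduino C; the threshold and interval values are made-up placeholders, since our real numbers were tuned by trial and error):

```python
class MuscleCommander:
    """Turn raw muscle-sensor readings into 'forward' commands for the
    hexbot: fire when the signal crosses a threshold, but rate-limit
    outgoing commands so the hexbot has time to register each one."""

    def __init__(self, threshold=600, interval=0.15):
        self.threshold = threshold  # placeholder ADC level (0-1023)
        self.interval = interval    # placeholder seconds between commands
        self.last_sent = float("-inf")
        self.commands = []

    def step(self, reading, t):
        """Process one sensor reading taken at time t (in seconds).
        Returns True if a forward command was issued."""
        if reading >= self.threshold and t - self.last_sent >= self.interval:
            self.commands.append(("FORWARD", t))
            self.last_sent = t
            return True
        return False
```

A flexed forearm produces readings above the threshold, so repeated drum strokes issue a stream of evenly spaced forward commands instead of flooding the remote.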

Signals were transmitted to the hexbot via the hexbot remote, which we wired to the output ports of the Arduino board. This required wires to be soldered to the contacts for each command on the remote, allowing the outgoing signals from the Arduino board to be transmitted through the remote’s LED transmitter. Thus, while the remote was still functional, we simply used it as a “transmitting tower.” When the incoming signals from the muscle sensor reached the required threshold, an outgoing signal was sent through the remote to command the hexbot to move forward.

Side-to-side motion was controlled by the vibrations created by drumming. In this case, we placed a piezo element (https://www.sparkfun.com/products/10293) between two notebooks (acting as our drum pad), and wired it to the Arduino board. Vibrations registered by the piezo element were transcoded into a signal that commanded the hexbot to turn right. While we had hoped to have one piezo element for right turns and another for left turns, we were only able to use one successfully, resulting in a hexbot that only turned right. Again, the outgoing signal created by the Arduino board was wired directly to the “right” command on the hexbot remote, such that any vibrations that passed the command threshold caused the hexbot to turn right.
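The piezo side worked the same way in spirit. As a sketch (again in Python with a placeholder threshold, not our actual Arduino code), one drum hit should translate into exactly one turn command, so the trigger is the rising edge of the signal rather than its level:

```python
def hits_to_commands(samples, threshold=200):
    """Transcode a stream of piezo readings into 'RIGHT' turn commands:
    one command per rising edge of the threshold, so a single sustained
    vibration doesn't fire repeatedly. The threshold is a placeholder."""
    commands = []
    above = False
    for s in samples:
        if s >= threshold and not above:
            commands.append("RIGHT")
        above = s >= threshold
    return commands
```

Two distinct hits in a trace produce exactly two turn commands, even if each hit spans several samples.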

We used a breadboard to facilitate the wiring involved in this project, connecting the muscle sensor to its batteries, as well as the piezo element and the Arduino board. A basic summary of the wiring for the muscle sensor can be seen in the figure below.

sensordiagram

Personal Contribution: 

This was definitely more of a learning project than a leading project for me. As someone who has little experience in coding and wiring, my ability to help on a technical level was fairly limited. However, our partner Ross is a very capable technician who found time to explain some details to me as he was working. As a result, I gained a basic conception of the processes required to implement a project of this nature, as well as a better understanding of how to actually perform the technical aspects. For example, I was introduced to coding when Ross walked me through the sample code for the Arduino board that he had taken from Sparkfun.com and hacked to suit the needs of our project. He then showed me how to upload a program from a computer to an Arduino board, which is the crucial step in transcoding incoming biological signals (muscle impulses) into outgoing digital signals. Given that this required multiple pieces of technology to work in unison with each other, I was also introduced to some basic wiring and circuitry, including the value of a breadboard for connecting multiple pieces of technology into a single circuit. I was able to absorb some information as I watched Ross disassemble the Hexbot remote and solder new wires onto the command contacts. Since my ability to help with technical aspects of this project was fairly limited, I volunteered to act as a testing dummy for the muscle sensor, helping Ross to troubleshoot the system when we came up against early difficulties with the equipment’s sensitivity (at first unresponsive, then overly sensitive). Again, I experimented with multiple muscle locations before settling on the forearm as an effective muscle group for our activity (drumming). At the end of the project, I took responsibility for writing the project description. This served as a good way to recap all that I had learned through this project, and present the multiple steps in a cohesive format.
Overall, my greatest contribution was in the conceptual phase of this project. My background in ecology has had a profound impact on my worldview, and I enjoyed the opportunity to incorporate biological systems or properties with technology. This project was a great example of the value of multiple perspectives/backgrounds in a group project, and was one of the most rewarding experiences I have had with an interdisciplinary course. During the “initial ideas” phase of this project, I came up with a variety of ways in which I wanted to see the biological world integrated with the digital world, and Ross and Charles were able to take my ideas and explain how they could be implemented with technology. For example, I expressed an interest in creating a robot that would interact with its environment in a biological manner, responding to stimuli and acting without the need for a human to issue commands. In response, Ross and Charles suggested that we could utilize the infrared signaling capabilities of the Hexbot by placing it within a network of infrared “pillars” or “signal towers” with which it could interact, sending and receiving signals and changing its movement in response to the position of other objects in its environment. As an alternative, Ross suggested that we could attach a sensor to the bottom of the Hexbot that would allow it to register differences in light/color, allowing it to follow (or avoid) paths or markings that we created on its walking surface. For example, using a sharpie to draw a dark black path (or obstacles) on a large sheet of white paper, we could create an “environment” for the hexbot to interact with. Thus, this bio-sensing project was almost like a meta-project for me: as I developed ideas with my biology-minded brain, Ross and Charles “transcoded” those ideas into technical versions, allowing them to be implemented in class.

Biosensing: Artist’s Statement

This brief statement introduces the biosensing project that we’ve been working on for the last few weeks. It’s the final product of the jumble of ideas that I posted earlier, and we managed to stick to the original plan fairly well. Pictures / video from our demonstration will be posted in the next few days!

With this project, we aim to create an interactive experience that is both fun and thought-provoking. Muscle impulses from the drummer inspire the hexbot to move forward, while hits registered from the sticks influence the direction it travels in. People are moved in many ways by music, whether physically or emotionally, or both. To view the hexbot moved by the biosensed impulses of a drummer’s arm and see it dance to the rhythm is to reimagine our understanding of music. We can become the influencer and the observer, and yet still partake in the experience.

This technology could also be viewed as a way to create a multimedia art piece. While we chose to have our muscle impulses control the movement of the hexbot, these impulses could be transcoded into a wide array of other formats to create a visual output. So, with muscle sensors hooked up to a drummer (as they are in this demonstration), the muscle impulses involved in the drumming process could be converted into an accompanying visual output that is intrinsically tied to the audio output. Thus, a dynamic multimedia piece is born, creating a bi-sensory experience for the audience. By presenting the muscle impulses of a drummer in a visual format, the audience can suddenly experience and appreciate the skill of the drummer (typically evaluated as an auditory output) in a whole new medium, potentially expanding their enjoyment and understanding of the performance. In addition, this concept could expand far beyond drumming. For example, this technology could be used to convert the muscle impulses from a soccer player’s calf, a weightlifter’s biceps, or an orchestra conductor’s forearm into a musical number, a visual display, or a tactile experience.

Parable of the Polygons – On the Shape of Society

This post is about a website that we were shown in the first week of class, in the context of emergence and organization in biological systems. The topic has been on my mind lately, so I thought I’d repost it. Here’s the website link: http://ncase.me/polygons/

This website provides a model for how small individual biases in society interact with one another to create segregation. While it doesn’t necessarily describe an ecosystem, it is a good visualization of how a clearly visible order is created in a “bottom-up” manner from many individual actions. Using society and segregation as an example also helps to explain emergence in terms that are easily understandable, which makes this a powerful exercise.
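The model on that site is a variant of Schelling’s segregation model. A toy one-dimensional version (my own sketch, not the site’s code) shows the same effect: even a mild preference for similar neighbors produces clustered, segregated arrangements.

```python
import random

def unhappy(grid, i, bias=1/3):
    """An agent is unhappy if fewer than `bias` of its immediate
    neighbors share its type (None marks an empty cell)."""
    neighbors = [grid[j] for j in (i - 1, i + 1)
                 if 0 <= j < len(grid) and grid[j] is not None]
    if not neighbors:
        return False
    same = sum(1 for n in neighbors if n == grid[i])
    return same / len(neighbors) < bias

def simulate(grid, bias=1/3, rounds=100, rng=random):
    """Each round, unhappy agents swap places with a random cell,
    repeating until everyone is content (or rounds run out)."""
    grid = list(grid)
    for _ in range(rounds):
        movers = [i for i, a in enumerate(grid)
                  if a is not None and unhappy(grid, i, bias)]
        if not movers:
            break
        for i in movers:
            j = rng.randrange(len(grid))
            grid[i], grid[j] = grid[j], grid[i]
    return grid
```

The point of the exercise is that no agent wants segregation; the clustered order emerges bottom-up from many small individual choices, just like on the website.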

Bio-sensing Project Ideas

We’ve moved on from biological form to biosemiotics, which can be simply defined as the study of signs and codes in the biological realm. Over the next few weeks, we’ll be working on biosensing projects, with the class divided into groups of three. I’ve decided to write down some of my group’s initial project ideas, just because putting things in words helps me to think through the different components of a project. Ultimately, we’ll pursue one or two of these ideas in our project, but I anticipate that our idea will pivot along the way as we encounter new inspiration or find that certain aspects are difficult to execute.

Project Ideas:

  • Playing off the idea of a solar-powered robot that constantly searches for the sun, try to create a robot that has some sort of biological sensing capacity and acts in a “biological” way. That is, you are not telling it what to do, it is doing what it needs to do to survive. This requires it to sense its environment (either send or receive signals of some sort) and react accordingly.
    • Working with hexbots linked to a brain or muscle input, we can create certain thresholds that correspond to actions (i.e., over time, if the brain input is above some level, that corresponds to movement). We could also use the infrared sensors on the hexbot to let it interact with its environment. So, we can put infrared “pillars” in the environment that either attract or repel it (mimicking a food source or a predator), and then we have an interaction between the command signals (which are originating from our brain or muscles) and the environment (which it interacts with in a roughly “biological” way). Our impulses give the command to move or stay still (we can program certain thresholds to command for left or right, etc.), and these movements are placed in the context of the infrared “environment” that we create. This is the “biological robot” aspect of this project, in which it interacts with its environment.
    • We could also create dark and light paths and obstacles on a large sheet of paper, and have the robot sense light vs. dark and be attracted to one. Then, as it is receiving the different inputs and moving around in response to them, it is also responding to its environment, either following or avoiding the light and dark paths and obstacles. We could still keep the metaphor for a biological system, where you make some obstacles represent food or predators.
  • Using a headset, record/measure human brainwaves when playing different drumbeats and rhythms. Then, convert this output to movement signals for a hexbot. The hexbot would be fitted with a pen to give a visual display of the brainwave output. I like the broader idea of converting music into a visual output based on human response to it. I’d like to see how different genres of music trigger different responses in the brain, and how those are converted into a visual display (a picture or a video). This doesn’t necessarily need to use a hexbot and pen as the mechanism to convert the brainwave output to a visual output.
    • When we get the impulse measurements from our brain or muscles, it will come in as a series of numbers. We can choose to plot these over time, creating a wave form, or we could turn these numbers into some other type of visual output. Theoretically, our muscle or brain responses would differ as we listen to different genres of music, and this would be reflected in the different data and thus different visual output.

I’ll try to update this post as we start our project and alter these original ideas.

Technology, Evolution, and the Pursuit of Perfection

This post is a response to ideas posed in Jussi Parikka’s “Insect Media.” Below is a series of questions and ideas that I developed while considering the relationship between technology and evolution. With a background in ecology, I tend to look at most things from an evolutionary perspective, which brings up some interesting conflicts when paired with technology.

Original Post 

On page 91, Parikka states that “variation acts as the force transversal to the technological, animal, and creative worlds.” In other words, it is the intersection between these three entities. However, I find myself disagreeing with this analysis. In my opinion, the technological world is severely lacking in variation. In our modern industrial economy, it seems that we value uniformity more than variation. Starting with Henry Ford and the production line, our industrial system is designed to mass-produce objects that are identical in form and function, thereby maximizing efficiency and minimizing cost. In my opinion, this prioritization of uniformity represents a fundamental deviation from the natural world, where variation is the basis of life. Variation in traits is one of the three fundamental pillars of natural selection (along with heritability and differential survival associated with these traits). Evolutionary forces can only “act” upon the natural variation that exists in a population, meaning variation sits at the very core of speciation and the entire range of life that exists today.

The absence of variation in technology poses a number of further questions:

1. If we intentionally incorporated variation into technology (i.e. created slight variations in each individual copy of a given product — for a cellphone, this could be a slightly bigger screen, slightly smaller buttons, better or worse screen resolution or sound quality, etc.), could we create a sort of technological evolution in which we artificially select for the “traits” that we find most beneficial? With human preference acting as the selective force in this system, would it be possible to create a dynamic system in which technology improves in new and unforeseen ways? When it comes to technology, I think we have a clear conception of what is “better” — we always want devices to be faster, smaller, more powerful, etc. However, this is ultimately a matter of opinion. By introducing variation and then acting upon that variation with a selective force, it is possible that we would recognize benefits to technology that don’t necessarily align with our initial assumptions (that is, maybe we realize that there are situations in which having a technology that is slower or less powerful is actually more beneficial in some way, or fills some niche that was previously empty).

2. The comparison between technology and evolution makes me begin to consider some big-picture questions about the inherent direction of nature and life. We tend to view technological advances as a sign that we are becoming more sophisticated, ultimately moving closer to some definition of “perfection”. However, as I mentioned earlier, technology doesn’t fit within the natural world. It is not impacted by evolution. Personally, I think a strong case can be made for the system of evolution as the ultimate manifestation of “perfection”. It is dynamic and ongoing, encompassing all biotic entities on earth. In this way, it ensures that the entire system is constantly changing in harmony. Any action by one organism has an impact on another, and the system of evolution regulates these interactions in a way that ensures the continued existence of life (although species may constantly be going extinct). Thus, the system ensures its own continuation. However, technology doesn’t really fit into this system, because it is abiotic. It doesn’t interact with other organisms within the context/rules of the system. While technology can impact organisms within the system, these organisms don’t really impact technology on an equivalent scale. In this way, the delicate balance created by evolution is thrown off by the introduction of technology. This brings up a new question:

3. Are we really “advancing” by developing new and powerful technologies? We are, in the sense that we are improving the quality of human lives, but I think it’s pretty clear that this improvement comes at a cost to the overall system, which is in some way damaged by technology. So, by developing and implementing technology in our lives, we are introducing a new force into the system of evolution — a force that acts on the system, but is not acted upon. In this way, we may be threatening the continued existence of life.
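The artificial-selection thought experiment from question 1 can be sketched as a toy loop. Everything here is hypothetical: `preference` stands in for human choice and `mutate` for manufacturing variation, and the numbers are invented for illustration only.

```python
import random

def evolve(population, preference, mutate, generations=200, rng=random):
    """Toy artificial selection on product variants: each generation,
    keep the half of the population that users 'prefer' most, then fill
    the other half with mutated copies of the survivors."""
    for _ in range(generations):
        population = sorted(population, key=preference, reverse=True)
        survivors = population[: len(population) // 2]
        population = survivors + [mutate(rng.choice(survivors))
                                  for _ in survivors]
    return max(population, key=preference)
```

If the “trait” is a single number and users prefer values near some ideal, the population drifts toward that ideal over the generations, which is the dynamic the question imagines for screens, buttons, and sound quality.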

More Biological Form

Mammoth BeeMammoth Bee 1

This was a quick side-project that I created while working on modeling biological form. It uses the same model of a Eulaema bee, but I’ve added various appendages from other animals. It has the claws of a blue crab, the tusks of a wooly mammoth, and a proboscis that is actually a fossil of a dolphin skull. I created this model as a way to familiarize myself with Blender, so this piece doesn’t have much of an explanation. I’m still working on creating an ecologically plausible explanation for why natural selection would produce a bee with so many complicated appendages — the most logical reason would probably be a complex case of co-evolution with a flower that has corresponding parts. From an evolutionary standpoint, the evolution of specialized pollinators helps a flower to ensure pollination; if a plant has many “generalist” pollinators, there is potential for those pollinators to transport pollen to an incompatible plant species, which is energetically inefficient for the original plant. Thus, natural selection can drive the development of complex “defenses” that only specialized pollinators can breach. In this way, even though co-evolution between a plant and a specific pollinator may inhibit access by other pollinators, it increases the likelihood of successful pollination by creating a guaranteed relationship with a single pollinator species (which in turn only pollinates a single flower species).

If I have some spare time, I’ll try to create a model of a flower whose pollination would necessitate the complex appendages shown in the bee model above.