Articles

How We Made Muletallica

This post is the second in a series of three articles about the IoT projects that came out of our first internal hackathon of the year, which we then showed off at our first-ever Integration of Things Zone at CONNECT 2015. Missed us there? Don't worry: not only will you get a glimpse of the cool installations we built below, you'll also get a front-row seat to how we built them and with which technologies. So put on your favorite rockstar jeans and jump into the mosh pit to learn how the team created Muletallica, an interactive visual/musical experience for our conference attendees that connected devices such as a Leap Motion sensor and smart light bulbs with electric guitars... and a bell pepper.

Why We Did It

Muletallica came out of the internal IoT hackathon we held at MuleSoft back in April. It was a team project, built by Federico Amdam, Jesica Fera, Pablo Carballo, and myself, all of us working out of the MuleSoft office in Buenos Aires.

Muletallica was born out of something I had originally been tinkering with for a while: the idea of using technology to create interactive musical installations that would let people, potentially without any musical knowledge or skills, experience the joy of making music in a fun and creative way, with a minimal learning curve.

For the hackathon, our plan was to take some of my earlier musical experiments and use Mule to integrate them with smart lights, making the whole experience far more immersive and engaging. Thanks to the added visual feedback the lights provided, the responses to people's actions became much easier to connect with.

Fast-forward a month, and I was in San Francisco at the CONNECT conference, representing our team. In the meantime, our marketing team had helped design an awesome installation to showcase Muletallica, one that really made the project stand out. And there I was, essentially living the dream, since (at least for a couple of days) my job description included playing guitar and telling people about cool tech toys. Anyone who walked up to the booth with curiosity was invited to join the band and jam with us, and they were always very curious about how it was all achieved, so I was happy to walk them through the interaction design and the underlying architecture.

For the lights we used MiLight bulbs, which expose a Python API that isn't exactly easy to work with. The better-known Philips Hue bulbs expose a nice REST API that can easily receive HTTP requests... but that would have been too easy. We wanted a challenge where we could demonstrate Mule's ability to take an ugly legacy interface and make it usable, so we went with MiLight instead.

Muletallica, Play By Play

In the video above you can see me playing several different instruments. Each instrument is linked to a different smart light and sets its hue and intensity through Mule messages:

  • The Air Piano: At first, I play a Leap Motion sensor as an air piano that can be played by simply stroking imaginary keys. The beauty of this is that whatever you play will always be in the right key and adjusted to land right on the beat. Literally anything you play will always sound musically good, or at least not painfully off. At the same time, we had a light flicker once for every note played, with a hue mapped to the note’s pitch.
  • The Guitar Duel: When I play the guitar, the sequence of notes is stored, so that playing the air piano afterwards automatically runs you through that same sequence. This made for a pretty interesting request-response kind of musical conversation between two instruments, where the notes would be the same but the free interpretation of their timing was enough to allow for some exciting musical expression. It was also a fun way to interact with members of the audience who were brave enough to accept the challenge of playing back whatever I played. One of the lights was mapped to the guitar and flickered with every note I played, with its hue mapped to note pitch.
  • Adding Beats: When one of a series of printed cards is presented to the webcam on the laptop, the drum pattern changes. Here the computer uses Reactivision, computer vision software originally built for the Reactable, to recognize the cards. Using this in our setup was a little tribute to the creator of the Reactable, Sergi Jordá, a professor of mine who first inspired me to pursue this ideal of making music creation accessible to everyone. One of the lights flickers in time with the drum beats, mapping beat intensity to luminosity; it also changes color whenever the pattern changes. Each change in the drum pattern also triggers the playing of a short three-second video.
  • The Wiiii and the WubWub: After changing the drum beat to the most electronic pattern, playing the same Leap Motion sensor as before invokes a dubstep-ish, theremin-like instrument that responds to the height and angle at which you hold your hand above the sensor. It can actually tell which hand you’re holding up and plays a different instrument depending on which it sees. I called one of these instruments “Wiiii” and the other one “WubWub” …I suppose you can easily tell which is which from the video. Every change in these instruments was also manifested through a change in the hue of its corresponding light.


  • The Bell Pepper: Our addition of a music-making vegetable piqued a lot of visitors’ curiosity. It was an actual vegetable wired to a Makey Makey, and it responded with a chord change every time someone touched it (cycling through a sequence of pre-defined chords). Yes, touching the bell pepper involved an (imperceptibly low) electric current passing through your body. Some people seemed a little uneasy about this idea, so I would assure them that the bell pepper we were using was 100% organic, fair-trade, fresh produce with no additives whatsoever, and then proceed to show them the sticker certifying that it was in fact organic. One of the lights changed color whenever the bell pepper was touched.
The music that could be made with Muletallica was far from anything resembling the sound of Metallica… it could be described as mellow, Pink Floydish, trance-inducing prog rock, or sometimes as full-on twisted synthetic-sounding dubstep, but certainly never as heavy metal or anything even faintly close to it. We came up with the name as a random pun that we never expected would be taken seriously as a proper name, but people seemed to like it quite a bit, and so we went with it… it’s like they say: if you build it, they will come.
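As described above, each instrument mapped note pitch to the hue of its light and velocity or beat intensity to luminosity. A minimal sketch of what such a mapping can look like (the helper names are hypothetical, not code from the project):

```python
def note_to_hue(note, hue_steps=256):
    """Map a MIDI note number (0-127) onto a bulb hue value (0-255).

    One octave (12 semitones) sweeps the full color wheel, so the
    same pitch class always lands on the same hue.
    """
    pitch_class = note % 12
    return (pitch_class * hue_steps) // 12


def velocity_to_luminosity(velocity, max_brightness=100):
    """Map MIDI velocity (0-127) linearly onto brightness (0-100)."""
    return (velocity * max_brightness) // 127
```

With this kind of mapping, a C played in any octave always flashes the same color, which makes the note-to-light relationship easy for the audience to pick up on.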

Muletallica’s Backstage

All of our Integration of Things Zone projects featured MuleSoft products in their internal structure in some way or another. In the case of Muletallica, I must admit that Mule was not the backbone of the project, but it was still an essential bone in its structure.

The backbone was Usine, a not-so-well-known French piece of software that is actually amazingly versatile and ideal for live performances like this one. It shares a certain philosophy with Mule: with it, you also build flows by dragging and dropping atomic components, which include all kinds of connectors and transformers. Just like in Anypoint Studio, everything is exposed through a graphical interface, while you can also drop down into the code and write things by hand.

Most of the external components involved were connected through MIDI, a widely adopted standard for musical interfaces. Thanks to the prevalence of that standard, connectivity was not a challenge when hooking Usine up to Reactivision or to Mogees. The lights we used, however, didn’t support MIDI or any other universal standard for that matter, so that’s where we had to truly put on our integration-developer hats and solve the puzzle.
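MIDI itself is a tiny binary protocol: each channel voice message is a status byte whose upper nibble encodes the message type and whose lower nibble encodes the channel, followed by two data bytes (note number and velocity for note events). Purely to illustrate the format the paragraph above refers to, here is a stdlib-only sketch of decoding note-on/note-off messages (not code from the project):

```python
def parse_midi_message(data: bytes):
    """Parse a 3-byte MIDI channel voice message.

    Returns a (kind, channel, note, velocity) tuple for note-on /
    note-off messages, or None for anything else.
    """
    if len(data) != 3:
        return None
    status, note, velocity = data
    kind = status & 0xF0      # upper nibble: message type
    channel = status & 0x0F   # lower nibble: channel 0-15
    if kind == 0x90 and velocity > 0:
        return ("note_on", channel, note, velocity)
    if kind == 0x80 or (kind == 0x90 and velocity == 0):
        # By convention, a note-on with velocity 0 counts as note-off.
        return ("note_off", channel, note, velocity)
    return None
```

It is exactly this simplicity and ubiquity that made wiring Usine to the other MIDI-speaking components a non-issue.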

We then built a RAML definition that exposed a series of methods for calling our lights, and with that in place it was really easy to build an APIkit project and have it automatically flesh out all of the scaffolding we needed for a neat RESTful wrapper around their ugly API. We then injected a few lines of Python code into Mule; these executed the commands that made up the MiLight API, as well as the commands of a Python MIDI library that let us receive the MIDI messages Usine sent and turn them into Mule messages.
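The actual mule.py isn’t reproduced here, but its shape was roughly this: take the group and command values passed in from the Mule flow and drive the lights accordingly. While the project called into the MiLight Python API, the same effect can be sketched at the protocol level; the byte values below follow the commonly documented v3/v4 MiLight bridge protocol (3-byte UDP datagrams to port 8899), and the bridge IP is hypothetical, so treat all of this as an assumption rather than the project’s exact code:

```python
import socket

BRIDGE_ADDR = ("192.168.1.100", 8899)  # hypothetical bridge IP


def build_packet(command, value=0):
    """Build a 3-byte MiLight bridge datagram.

    'color' sets hue (0-255), 'brightness' sets level, 'on' switches
    all groups on. Every packet ends with the 0x55 terminator byte.
    """
    if command == "color":
        return bytes([0x40, value & 0xFF, 0x55])
    if command == "brightness":
        # the bridge accepts brightness levels 2..27
        level = 2 + (value * 25) // 255
        return bytes([0x4E, level, 0x55])
    if command == "on":
        return bytes([0x42, 0x00, 0x55])
    raise ValueError("unknown command: %s" % command)


def send(packet):
    """Fire the datagram at the bridge (UDP, no reply expected)."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(packet, BRIDGE_ADDR)
    sock.close()
```

Since UDP is fire-and-forget, each Mule message translates into a single datagram with no round-trip, which keeps latency low for live performance.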

The RAML definition we wrote for wrapping the MiLight API in a REST API:

    #%RAML 0.8
    title: muletallica
    version: 1.0.0
     baseUri: http://server/api
    /effects:
      displayName: effects
      /{group}:
        displayName: group
        /gamma:
          displayName: gamma
          put:
            description: change color gamma in a group of lights to any color
            body: 
              application/json:
                example: |
                 {
                   "note": 1
                 }
        /directcolor:
          displayName: direct color
          put:
            description: change color gamma in a group of lights, to predefined colors
            body: 
              application/json:
                example: |
                 {
                   "note": 32,
                   "velocity": 100
                 }
        /intensity:
          displayName: intensity
          put:
            description: change brightness in a group of lights
            body: 
              application/json:
                example: |
                 {
                   "velocity": 1
                 }
        /both:
          displayName: both
          put:
            description: flicker with color and intensity
            body: 
              application/json:
                example: |
                 {
                   "note": 1,
                   "velocity": 1
                 }
        /flicker:
          displayName: flicker
          put:
            description: make a group of lights flicker
            body: 
              application/json:
                example: |
                 {
                   "note": 1
                 }
        /wiii:
          displayName: wiii
          put:
            description: make wiii effect in a group of lights
            body: 
              application/json:
                example: |
                 {
                   "note": 1,
                   "velocity": 1
                 }
        /wub:
          displayName: wub
          put:
            description: make wub effect in a group of lights
            body: 
              application/json:
                example: |
                 {
                   "note": 1,
                   "velocity": 1
                 }
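Once this API is deployed (the HTTP listener in the Mule configuration below binds localhost:8081 with base path /api), triggering a light effect is just a PUT with a small JSON body. Here is a stdlib-only sketch of what a client call looks like; the helper name is illustrative, not part of the project:

```python
import json
import urllib.request


def build_effect_request(group, effect, **body):
    """Build a PUT request for the light-effects API.

    e.g. build_effect_request(1, "gamma", note=1) targets
    PUT http://localhost:8081/api/effects/1/gamma
    """
    url = "http://localhost:8081/api/effects/%d/%s" % (group, effect)
    data = json.dumps(body).encode("utf-8")
    req = urllib.request.Request(url, data=data, method="PUT")
    req.add_header("Content-Type", "application/json")
    return req


# Sending the request (requires the Mule app to be running):
#   urllib.request.urlopen(build_effect_request(1, "both", note=1, velocity=1))
```

Each of the resources in the RAML above maps to exactly one such call, so any MIDI-capable or HTTP-capable component can drive the lights.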

The XML of our Mule flows. Much of this was automatically built by APIkit from the RAML file above:

    <?xml version="1.0" encoding="UTF-8"?>
    
    <mule xmlns:tracking="http://www.mulesoft.org/schema/mule/ee/tracking" xmlns:scripting="http://www.mulesoft.org/schema/mule/scripting"
    xmlns:doc="http://www.mulesoft.org/schema/mule/documentation" xmlns="http://www.mulesoft.org/schema/mule/core" xmlns:apikit="http://www.mulesoft.org/schema/mule/apikit" xmlns:http="http://www.mulesoft.org/schema/mule/http" xmlns:spring="http://www.springframework.org/schema/beans" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.mulesoft.org/schema/mule/scripting http://www.mulesoft.org/schema/mule/scripting/current/mule-scripting.xsd
    http://www.mulesoft.org/schema/mule/core http://www.mulesoft.org/schema/mule/core/current/mule.xsd
    http://www.mulesoft.org/schema/mule/http http://www.mulesoft.org/schema/mule/http/current/mule-http.xsd
    http://www.mulesoft.org/schema/mule/apikit http://www.mulesoft.org/schema/mule/apikit/current/mule-apikit.xsd
    http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-3.1.xsd
    http://www.mulesoft.org/schema/mule/ee/tracking http://www.mulesoft.org/schema/mule/ee/tracking/current/mule-tracking-ee.xsd" version="EE-3.7.0">
    
    <http:listener-config name="api2-httpListenerConfig" host="localhost" port="8081" doc:name="HTTP Listener Configuration"/>
    <apikit:config name="api2-config" raml="api2.raml" consoleEnabled="true" consolePath="console" doc:name="Router"/>
    
    <apikit:mapping-exception-strategy name="api2-apiKitGlobalExceptionMapping">
    
     <apikit:mapping statusCode="404">
      <apikit:exception value="org.mule.module.apikit.exception.NotFoundException" />
      <set-property propertyName="Content-Type" value="application/json" doc:name="Property"/>
      <set-payload value="{ &quot;message&quot;: &quot;Resource not found&quot; }" doc:name="Set Payload"/>
     </apikit:mapping>
    
     <apikit:mapping statusCode="405">
      <apikit:exception value="org.mule.module.apikit.exception.MethodNotAllowedException" />
      <set-property propertyName="Content-Type" value="application/json" doc:name="Property"/>
      <set-payload value="{ &quot;message&quot;: &quot;Method not allowed&quot; }" doc:name="Set Payload"/>
     </apikit:mapping>
    
     <apikit:mapping statusCode="415">
      <apikit:exception value="org.mule.module.apikit.exception.UnsupportedMediaTypeException" />
      <set-property propertyName="Content-Type" value="application/json" doc:name="Property"/>
      <set-payload value="{ &quot;message&quot;: &quot;Unsupported media type&quot; }" doc:name="Set Payload"/>
     </apikit:mapping>
    
     <apikit:mapping statusCode="406">
      <apikit:exception value="org.mule.module.apikit.exception.NotAcceptableException" />
      <set-property propertyName="Content-Type" value="application/json" doc:name="Property"/>
      <set-payload value="{ &quot;message&quot;: &quot;Not acceptable&quot; }" doc:name="Set Payload"/>
     </apikit:mapping>
    
     <apikit:mapping statusCode="400">
      <apikit:exception value="org.mule.module.apikit.exception.BadRequestException" />
      <set-property propertyName="Content-Type" value="application/json" doc:name="Property"/>
      <set-payload value="{ &quot;message&quot;: &quot;Bad request&quot; }" doc:name="Set Payload"/>
     </apikit:mapping>
    
    </apikit:mapping-exception-strategy>
    
    <flow name="api2-main" processingStrategy="non-blocking">
     <http:listener config-ref="api2-httpListenerConfig" path="/api/*" doc:name="HTTP"/>
     <apikit:router config-ref="api2-config" doc:name="APIkit Router"/>
     <exception-strategy ref="api2-apiKitGlobalExceptionMapping" doc:name="Reference Exception Strategy"/>
    </flow>
    
    <sub-flow name="pythonApi">
     <scripting:component doc:name="Python">
      <scripting:script engine="jython" file="/Users/nearnshaw/muletallica/mule.py">
       <property key="group" value="#[group]" />
       <property key="command" value="#[command]" />
      </scripting:script>
     </scripting:component>
    </sub-flow>
    
    <flow name="put:/effects/{group}/gamma:api2-config" processingStrategy="non-blocking">
     <object-to-byte-array-transformer doc:name="Object to Byte Array" />
     <object-to-string-transformer doc:name="Object to String" />
     <set-variable variableName="command" value="0" doc:name="Variable" />
     <flow-ref name="pythonApi" doc:name="Call Python Api" />
     <set-payload value="&quot;Called Gamma&quot;" doc:name="Set Payload" />
    </flow>
    
    <flow name="put:/effects/{group}/directcolor:api2-config" processingStrategy="non-blocking">
     <object-to-byte-array-transformer doc:name="Object to Byte Array" />
     <object-to-string-transformer doc:name="Object to String" />
     <set-variable variableName="command" value="6" doc:name="Variable" />
     <flow-ref name="pythonApi" doc:name="Call Python Api" />
     <set-payload value="&quot;Called Change Direct Color&quot;" doc:name="Set Payload" />
    </flow>
    
    <flow name="put:/effects/{group}/both:api2-config" processingStrategy="non-blocking">
     <object-to-byte-array-transformer doc:name="Object to Byte Array" />
     <object-to-string-transformer doc:name="Object to String" />
     <set-variable variableName="command" value="5" doc:name="Variable" />
     <flow-ref name="pythonApi" doc:name="Call Python Api" />
     <set-payload value="&quot;Called Change Both&quot;" doc:name="Set Payload" />
    </flow>
    
    <flow name="put:/effects/{group}/flicker:api2-config" processingStrategy="non-blocking">
     <object-to-byte-array-transformer doc:name="Object to Byte Array" />
     <object-to-string-transformer doc:name="Object to String" />
     <set-variable variableName="command" value="1" doc:name="Variable" />
     <flow-ref name="pythonApi" doc:name="Call Python Api" />
      <set-payload value="&quot;Called Flicker&quot;" doc:name="Set Payload" />
    </flow>
    
    <flow name="put:/effects/{group}/intensity:api2-config" processingStrategy="non-blocking">
     <object-to-byte-array-transformer doc:name="Object to Byte Array" />
     <object-to-string-transformer doc:name="Object to String" />
     <set-variable variableName="command" value="2" doc:name="Variable" />
     <flow-ref name="pythonApi" doc:name="Call Python Api" />
      <set-payload value="&quot;Called Intensity&quot;" doc:name="Set Payload" />
    </flow>
    
    <flow name="put:/effects/{group}/wiii:api2-config" processingStrategy="non-blocking">
     <object-to-byte-array-transformer doc:name="Object to Byte Array" />
     <object-to-string-transformer doc:name="Object to String" />
     <set-variable variableName="command" value="3" doc:name="Variable" />
     <flow-ref name="pythonApi" doc:name="Call Python Api" />
      <set-payload value="&quot;Called Wiii&quot;" doc:name="Set Payload" />
    </flow>
    
    <flow name="put:/effects/{group}/wub:api2-config" processingStrategy="non-blocking">
     <object-to-byte-array-transformer doc:name="Object to Byte Array" />
     <object-to-string-transformer doc:name="Object to String" />
     <set-variable variableName="command" value="4" doc:name="Variable" />
     <flow-ref name="pythonApi" doc:name="Call Python Api" />
      <set-payload value="&quot;Called Wub&quot;" doc:name="Set Payload" />
    </flow>
    
    </mule>

This project let us show off Mule’s speed and stability when dealing with a massive stream of simultaneous requests. In music, timing is the single most important thing: the slightest delay renders an interface unusable for musical interaction, which is why music is the ultimate challenge for testing the real-time readiness of a system. We did have a few problems with delays at first, but we soon realised that the bottleneck was actually our Wi-Fi signal, not Mule. With that fixed, we got to the point where delays were virtually imperceptible. The music software we were running is pretty heavy on a machine’s resources, and we were running Mule on that same laptop …even then we didn’t experience any significant delays.

Looking forward, it would be amazing if someone took the time to build a MIDI connector for Mule. With that in place, this entire project could have been built around Mule, controlling even the triggering of musical notes and everything else… I really look forward to doing that some day!