
5-SQUARE

Made with Max Horwich and Arnav Wagh | Year: 2017 

My role: led the wearable-technology build and created the graphics with p5.js.

Special thanks to Yotam Mann, creator of Tone.js

A gestural graphic interface, made for non-musicians, to create looping music with simple clicks and drags, while manipulating it with a physical glove. Final project for ICM with Allison Parrish, and Physical Computing with Daniel Rozin.

Tools: JavaScript (p5.js and Tone.js), Adafruit Flora, physical computing

This project consists of three parts:
1. Music generation, coded with the Tone.js library
2. Interactive animations for the visuals, in p5.js
3. A glove that uses flex sensors to add effects to the music

The 5 Squares:

1. Snare
2. Piano Melody
3. Bass
4. Beat
5. Lead Melody


Inspiration and Ideation

With our combined experience in music theory, creative coding, interactive graphics and wearable tech, and a shared passion for building a fun musical project, we settled on a goal: create a graphic and musical interface, driven by a glove, that enables inexperienced people to create, enhance and play music.

After about a week of brainstorming, letting our simple ideas multiply and evolve while staying faithful to the project goals, and researching the scope and methods of similar projects, we decided to build a multi-faceted, holistic music-making interface: the user draws basic musical notes as doodles on a screen and enhances the result with the glove. The program takes the coordinates of the drawings as input to play and loop them, and readings from the glove's sensors to add drastic effects, both musical and visual, to show the fun and the power in the art form itself.

Our main inspiration from this research was the Mi.Mu glove made by Imogen Heap: a glove that records, loops, plays and adds effects to sounds and music while the user performs.

For the interface on the screen, our designs are still coming together, but the current target is around halfway between a traditional step sequencer and Kandinsky from the Chrome Music Lab Experiments. We researched a new library of functions for our project, Tone.js, and learned about its abilities and pitfalls.

Process

Graphic Interface

Musical programming and creative coding for graphics

Working in the p5.js editor, we added musical functionality using simple oscillator (osc) functions. We first experimented with a sketch that plays beats at different frequencies and note lengths, and a second sketch that creates markers so the user can draw, mapping a sound to each coordinate and pushing it to an array.
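To illustrate the idea of mapping drawn coordinates to sounds, here is a minimal sketch. The scale, function names, and canvas range are assumptions for illustration, not the project's actual code; quantizing to a pentatonic scale is one common way to make arbitrary doodles sound musical.

```javascript
// Illustrative only: map a drawn point's y-coordinate to a note,
// quantized to a C-major pentatonic scale (an assumed choice).
var scale = ["C4", "D4", "E4", "G4", "A4", "C5", "D5", "E5"];

function yToNote(y, height) {
  // Higher on the canvas (smaller y) means a higher note.
  var idx = Math.floor((1 - y / height) * scale.length);
  idx = Math.min(scale.length - 1, Math.max(0, idx)); // clamp to the scale
  return scale[idx];
}

// Each click/drag point is stored so the melody can be looped later.
var melody = [];
function addPoint(x, y, height) {
  melody.push({ x: x, note: yToNote(y, height) });
}
```

In the real sketch, a handler like p5's `mouseDragged()` would call something like `addPoint()` and a Tone.js synth would trigger the returned note.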

Essentially, the five squares contribute to the music flow like this:

On Loop (with 4 consecutive notes of increasing frequency):

Snare - Beat - Piano - Bass

Freehand drawing (the fifth, larger square): 

Lead Melody

Using the Tone.js library, we coded the freehand drawings to loop, with on-screen sliders controlling playback. Here are a few snippets of the code:
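The four looping squares fire in the order Snare - Beat - Piano - Bass. As a sketch of how that cycle could be scheduled (this is an assumed block-of-four-steps layout, not the project's actual scheduling code), a step counter can pick the active square on each tick:

```javascript
// Illustrative sketch: which of the four looping squares is active
// on a given transport step, assuming each square owns 4 consecutive
// steps in a 16-step cycle (the block size is an assumption).
var squares = ["snare", "beat", "piano", "bass"];

function activeSquare(step) {
  var block = Math.floor((step % 16) / 4); // 0..3
  return squares[block];
}

// In a real Tone.js sketch, Tone.Transport.scheduleRepeat would call
// a function like this every "16n" and trigger the matching synth.
```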


// snareFilter is a Tone.Filter defined elsewhere in the sketch
var snare = new Tone.MetalSynth({
  volume: -10,
  frequency: 60,
  envelope: {
    attack: 0.001,
    decay: 0.4,
    release: 0.2
  },
  harmonicity: 5.1,
  modulationIndex: 1,
  resonance: 800,
  octaves: 1.5
}).connect(snareFilter);

var bass = new Tone.MonoSynth({
  "volume": -10,
  "envelope": {
    "attack": 0.1,
    "decay": 0.3,
    "release": 2
  },
  "filterEnvelope": {
    "attack": 0.001,
    "decay": 0.01,
    "sustain": 0.5,
    "baseFrequency": 200,
    "octaves": 2.6
  }
}).toMaster();


var cChord = ["C4", "E4", "G4"];
var gChord = ["B3", "D4", "G4"];
var amChord = ["C4", "E4", "A4"];
var fChord = ["C4", "F4", "A4"];

// four-voice polyphonic piano (Tone.PolySynth wrapping Tone.Synth)
var piano = new Tone.PolySynth(4, Tone.Synth).connect(pianoDelay);
piano.set({
  "volume": -7,
  "oscillator": {
    "partials": [1, 2, 1]
  },
  "envelope": {
    "attack": 0.001,
    "decay": 0.1,
    "sustain": 0.3,
    "release": 1
  }
});

var kick = new Tone.MembraneSynth({
  "envelope": {
    "sustain": 0,
    "attack": 0.02,
    "decay": 0.8
  },
  "octaves": 10
}).toMaster();

Lead Melody with freehand drawing


var leadPaint = new Tone.PolySynth({
  "volume": -10,
  "oscillator": {
    "type": "square"
  },
  "envelope": {
    "attack": 0.2
  },
  "portamento": 0.05

}).connect(leadDelay);

Process

Glove

A wearable glove, with audio effects mapped to its flex sensors


The effects were mapped to the flex sensors embedded in the fingers of the glove:

var leadDelay = new Tone.PingPongDelay({
  "delayTime": "8n",
  "maxDelayTime": 1,
  "feedback": 0.82,
  "wet": 0.4

}).toMaster();


var crushPiano = new Tone.BitCrusher(4)
    .receive("crush")
    .toMaster();

var chebyBass = new Tone.Chebyshev(10)
    .receive("cheby")
    .toMaster();


var delayKick = new Tone.FeedbackDelay("4t", 0.38)
    .receive("delayKick")
    .toMaster();

var delaySnare = new Tone.FeedbackDelay("8t", 0.25)
    .receive("delaySnare")
    .toMaster();

var fft = new Tone.FFT(32);
var spectrum = new Tone.Waveform(1024);

var bassDist = new Tone.Distortion({
  "distortion": 0.4,
  "oversample": '2x'
}).connect(spectrum).toMaster();
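To show how a bend of a finger could drive one of these effects, here is a hypothetical mapping from a raw flex-sensor reading to an effect's wet level. The raw range passed in is an assumed calibration, and `sensorToWet` is an illustrative helper, not the project's actual code:

```javascript
// Hypothetical: normalize a raw flex-sensor reading into 0..1,
// suitable for an effect parameter such as a delay's wet level.
function sensorToWet(raw, rawMin, rawMax) {
  var t = (raw - rawMin) / (rawMax - rawMin); // normalize to 0..1
  t = Math.min(1, Math.max(0, t));            // clamp noisy readings
  return t;                                   // 0 = dry, 1 = fully wet
}

// e.g. on each serial read (range 250-600 is an assumed calibration):
// leadDelay.wet.value = sensorToWet(reading, 250, 600);
```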

Fabrication and Testing


The following videos show:

  • Sewing with conductive thread, FLORA and flex sensors + soldering to connect resistors and wires

  • Serial communication test to calibrate initial sensor values to our code

  • First successful test of 5-Square, played by Max Horwich
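The serial-communication calibration step above can be sketched as follows. This is an assumed approach (track the min and max raw values seen while the wearer flexes each finger, then normalize later readings against that range), not the project's actual calibration code:

```javascript
// Illustrative calibration pass: record the observed min/max of a
// flex sensor so later readings can be normalized against that range.
function makeCalibrator() {
  var min = Infinity, max = -Infinity;
  return {
    sample: function (raw) {       // call on every reading during calibration
      if (raw < min) min = raw;
      if (raw > max) max = raw;
    },
    range: function () { return { min: min, max: max }; }
  };
}

var cal = makeCalibrator();
[310, 280, 560, 300].forEach(cal.sample);
// cal.range() -> { min: 280, max: 560 }
```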
